Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
Open Science in (Higher) Education – data of the February 2017 survey
This data set contains:
Survey structure
The survey includes 24 questions, and its structure can be divided into five major themes: material used in courses (5), OER awareness, usage and development (6), collaborative tools used in courses (2), assessment and participation options (5), and demographics (4). The last two questions are an open-text question about general issues on the topics and singular open education experiences, and a request to provide the respondent's e-mail address for further questioning. The online survey was created with Limesurvey[1]. Several questions include filters, i.e. these questions were only shown if a participant chose a specific answer beforehand ([n/a] in the Excel file, [.] in SPSS).
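When working with the published files, these filter codes can simply be mapped to missing values on import. A minimal sketch with pandas, assuming a hypothetical file name (the "[n/a]" code is taken from the description above):

```python
import pandas as pd

# "[n/a]" marks questions hidden by a filter in the Excel export;
# the SPSS export uses "." for the same purpose.
df = pd.read_excel("open_science_survey_2017.xlsx", na_values=["[n/a]"])

# Heavily filtered questions surface as mostly-missing columns.
print(df.isna().sum().sort_values(ascending=False).head())
```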
Demographic questions
Demographic questions asked about the current position, the discipline, birth year and gender. The classification of research disciplines was adapted to general disciplines at German higher education institutions. As we wanted a broad classification, we summarised several disciplines and came up with the following list, including the option “other” for respondents who did not feel comfortable with the proposed classification:
The classification of current job positions was likewise chosen according to common positions at German higher education institutions, including positions with teaching responsibility. Here, we also included the option “other” for respondents who did not feel comfortable with the proposed classification:
We chose a free (numerical) text field for the respondent’s year of birth because we did not want to pre-classify respondents into age intervals. This leaves us the option to run different analyses on the answers and to correlate them with respondents’ age. A question about the country was left out, as the survey was designed for academics in Germany.
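Collecting the raw birth year means age intervals can be defined at analysis time rather than in the instrument. A minimal sketch, assuming a hypothetical column name `birthyear`:

```python
import pandas as pd

responses = pd.DataFrame({"birthyear": [1959, 1971, 1984, 1990, None]})
# The survey closed in 2017, so age is computed relative to that year.
responses["age_2017"] = 2017 - pd.to_numeric(responses["birthyear"], errors="coerce")

# Intervals are an analysis-time choice, not fixed in the questionnaire.
bins = [0, 29, 39, 49, 59, 120]
labels = ["<30", "30-39", "40-49", "50-59", "60+"]
responses["age_group"] = pd.cut(responses["age_2017"], bins=bins, labels=labels)
print(responses)
```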
Remark on OER question
Data from earlier surveys revealed that academics are often confused about the proper definition of OER[2]. Some seem to understand OER as free resources, or only refer to open source software (Allen & Seaman, 2016, p. 11). Allen and Seaman (2016) decided to give a broad explanation of OER, avoiding details so as not to tempt participants to claim awareness. Thus, giving an explanation carries a danger of bias. We decided not to give an explanation, but to keep this question simple. We assume that someone either knows about OER or does not. If they have not heard of the term before, they probably do not use OER (at least not consciously) or create them.
Data collection
The target group of the survey was academics at German institutions of higher education, mainly universities and universities of applied sciences. To reach them, we sent the survey to various internal institutional and external mailing lists and via personal contacts. Included lists were discipline-based lists, lists from higher education and higher education didactics communities, as well as lists from open science and OER communities. Additionally, personal e-mails were sent to presidents and contact persons from those communities, and Twitter was used to spread the survey.
The survey was online from February 6th to March 3rd, 2017; e-mails were mainly sent at the beginning and around the middle of that period.
Data clearance
We received 360 responses, of which Limesurvey counted 208 as complete and 152 as incomplete. Two responses were marked as incomplete but, after checking, turned out to be complete, so we added them to the complete responses. Thus, this data set includes 210 complete responses. Of the remaining 150 incomplete responses, 58 respondents did not answer the first question and 40 discontinued after the first question. The data show a constant decline in answers; we did not detect any particular survey question with a high dropout rate. Incomplete responses were deleted and are not included in this data set.
For data privacy reasons, we deleted seven variables automatically assigned by Limesurvey (submitdate, lastpage, startlanguage, startdate, datestamp, ipaddr, refurl). We also deleted the answers to question no. 24 (e-mail address).
References
Allen, E., & Seaman, J. (2016). Opening the Textbook: Educational Resources in U.S. Higher Education, 2015-16.
First results of the survey are presented in the poster:
Heck, Tamara, Blümel, Ina, Heller, Lambert, Mazarakis, Athanasios, Peters, Isabella, Scherp, Ansgar, & Weisel, Luzian. (2017). Survey: Open Science in Higher Education. Zenodo. http://doi.org/10.5281/zenodo.400561
Contact:
Open Science in (Higher) Education working group, see http://www.leibniz-science20.de/forschung/projekte/laufende-projekte/open-science-in-higher-education/.
[1] https://www.limesurvey.org
[2] The survey question about the awareness of OER gave a broad explanation, avoiding details to not tempt the participant to claim “aware”.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
The question list for the questionnaire – Demographics and basic work characteristics of survey respondents
CC0 1.0: https://spdx.org/licenses/CC0-1.0.html
Professional organizations in STEM (science, technology, engineering, and mathematics) can use demographic data to quantify recruitment and retention (R&R) of underrepresented groups within their memberships. However, variation in the types of demographic data collected can influence the targeting and perceived impacts of R&R efforts - e.g., giving false signals of R&R for some groups. We obtained demographic surveys from 73 U.S.-affiliated STEM organizations, collectively representing 712,000 members and conference-attendees. We found large differences in the demographic categories surveyed (e.g., disability status, sexual orientation) and the available response options. These discrepancies indicate a lack of consensus regarding the demographic groups that should be recognized and, for groups that are omitted from surveys, an inability of organizations to prioritize and evaluate R&R initiatives. Aligning inclusive demographic surveys across organizations will provide baseline data that can be used to target and evaluate R&R initiatives to better serve underrepresented groups throughout STEM.
Methods
We surveyed 164 STEM organizations (73 responses, rate = 44.5%) between December 2020 and July 2021 with the goal of understanding what demographic data each organization collects from its constituents (i.e., members and conference-attendees) and how the data are used. Organizations were sourced from a list of professional societies affiliated with the American Association for the Advancement of Science, AAAS (n = 156), or from social media (n = 8). The survey was sent to the elected leadership and management firms for each organization, and follow-up reminders were sent after one month. The responding organizations represented a wide range of fields: 31 life science organizations (157,000 constituents), 5 mathematics organizations (93,000 constituents), 16 physical science organizations (207,000 constituents), 7 technology organizations (124,000 constituents), and 14 multi-disciplinary organizations spanning multiple branches of STEM (131,000 constituents). A list of the responding organizations is available in the Supplementary Materials. Based on the AAAS-affiliated recruitment of the organizations and the similar distribution of constituencies across STEM fields, we conclude that the responding organizations are a representative cross-section of the most prominent STEM organizations in the U.S. Each organization was asked about the demographic information they collect from their constituents, the response rates to their surveys, and how the data were used.
Survey description
The following questions are written as presented to the participating organizations.
Question 1: What is the name of your STEM organization?
Question 2: Does your organization collect demographic data from your membership and/or meeting attendees?
Question 3: When was your organization’s most recent demographic survey (approximate year)?
Question 4: We would like to know the categories of demographic information collected by your organization. You may answer this question by either uploading a blank copy of your organization’s survey (link provided in the online version of this survey) OR by completing a short series of questions.
Question 5: On the most recent demographic survey or questionnaire, what categories of information were collected? (Please select all that apply)
- Disability status
- Gender identity (e.g., male, female, non-binary)
- Marital/Family status
- Racial and ethnic group
- Religion
- Sex
- Sexual orientation
- Veteran status
- Other (please provide)
Question 6: For each of the categories selected in Question 5, what options were provided for survey participants to select?
Question 7: Did the most recent demographic survey provide a statement about data privacy and confidentiality? If yes, please provide the statement.
Question 8: Did the most recent demographic survey provide a statement about intended data use? If yes, please provide the statement.
Question 9: Who maintains the demographic data collected by your organization? (e.g., contracted third party, organization executives)
Question 10: How has your organization used members’ demographic data in the last five years? Examples: monitoring temporal changes in demographic diversity, publishing diversity data products, planning conferences, contributing to third-party researchers.
Question 11: What is the size of your organization (number of members or number of attendees at recent meetings)?
Question 12: What was the response rate (%) for your organization’s most recent demographic survey?
*Organizations were also able to upload a copy of their demographics survey instead of responding to Questions 5-8. If so, the uploaded survey was used (by the study authors) to evaluate Questions 5-8.
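Question 5 responses lend themselves to a simple cross-organization tally of which categories are collected. A minimal sketch with illustrative data (the category strings follow the list above; the three organizations are hypothetical):

```python
from collections import Counter

# Illustrative Question 5 answers from three hypothetical organizations.
org_categories = [
    {"Gender identity", "Racial and ethnic group", "Disability status"},
    {"Gender identity", "Racial and ethnic group"},
    {"Gender identity", "Racial and ethnic group", "Sexual orientation", "Veteran status"},
]

counts = Counter(cat for org in org_categories for cat in org)
for category, n in counts.most_common():
    print(f"{category}: collected by {n} of {len(org_categories)} organizations")
```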
Terms of use: https://www.icpsr.umich.edu/web/ICPSR/studies/33321/terms
The University of Washington - Beyond High School (UW-BHS) project surveyed students in Washington State to examine factors impacting educational attainment and the transition to adulthood among high school seniors. The project began in 1999 in an effort to assess the impact of I-200 (the referendum that ended Affirmative Action) on minority enrollment in higher education in Washington. The research objectives of the project were: (1) to describe and explain differences in the transition from high school to college by race and ethnicity, socioeconomic origins, and other characteristics, (2) to evaluate the impact of the Washington State Achievers Program, and (3) to explore the implications of multiple race and ethnic identities. Following a successful pilot survey in the spring of 2000, the project eventually included baseline and one-year follow-up surveys (conducted in 2002, 2003, 2004, and 2005) of almost 10,000 high school seniors in five cohorts across several Washington school districts. The high school senior surveys included questions that explored students' educational aspirations and future career plans, as well as questions on family background, home life, perceptions of school and home environments, self-esteem, and participation in school-related and non-school-related activities. To supplement the 2000, 2002, and 2003 student surveys, parents of high school seniors were also queried to determine their expectations and aspirations for their child's education, as well as their own educational backgrounds and fields of employment. Parents were also asked to report any financial measures undertaken to prepare for their child's continued education, and whether the household received any form of financial assistance. In 2010, a ten-year follow-up with the 2000 senior cohort was conducted to assess educational, career, and familial outcomes. The ten-year follow-up surveys collected information on educational attainment, early employment experiences, family and partnership, civic engagement, and health status. The baseline, parent, and follow-up surveys also collected detailed demographic information, including age, sex, ethnicity, language, religion, education level, employment, income, marital status, and parental status.
The Education Experience Survey and Literacy Assessment was conducted in Shefa Province, Vanuatu in April 2011 for Ni-Vanuatu aged 15 to 60 years. The full report analyses the results of the survey and literacy assessment in detail and highlights correlations between respondents’ educational experience and their literacy levels, employment and income. The survey was aimed at rural Shefa Province and so did not cover the capital, Port Vila.
The survey and literacy assessment instrument and methodology were designed to collect accurate and statistically significant information about education and language experience, and to assess actual literacy levels at the provincial, village and individual level.
The results provide accurate, statistically significant primary data about the education experience of Ni-Vanuatu in Shefa Province.
Shefa Province, Vanuatu, not including Port Vila. Eight of the Province’s 15 islands were randomly selected.
The villages surveyed were located on the islands of Efate, Lelepa, Nguna, Emau, Emae, Buninga, Tongoa, and Laman Island (Epi). The villages in which the survey took place were Mele, Emua, Takara/Sara, Ekipe, Pangpang, Eton, Teoma, Etas, Lelepa, Utanlang, Taloa, Marou, Wiana, Buninga, Tongamea, Sangava, Euta, Matangi/Itakoma, Lumbukiti and Laman.
Households; individuals (Ni-Vanuatu aged 15 to 60 years).
The survey covered all people who normally resided in a selected household, between the ages of 15 and 60 years (inclusive).
Sample survey data [ssd]
The survey was conducted in households in randomly selected rural communities across eight of the 15 islands of Shefa Province. Port Vila was deliberately not included, although a similar exercise in Port Vila would also be very worthwhile. All people who normally resided in a selected household, between the ages of 15 and 60 years (inclusive), were invited to participate in the survey.
The literacy assessment questions were addressed only to respondents who declared an ability to read one of the official languages - English, French or Bislama. With regard to the sampling methodology, great care was taken to ensure that statistically significant results were obtained. The minimum required sample size was calculated using 2009 National Census population figures, which indicated the total target population - people between the ages of 15 and 60 - to be 57,174. The required sample size was 2.36% of the total population, meaning that 1,350 respondents were needed.
This minimum sample size was then used to guide the number of households that needed to be surveyed. It was assumed that a household would typically contain at least three eligible people (15-60 years). As such, it was planned to interview 20 villages, with 30 households within each village and an average of three people per household.
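A worked check of the planning arithmetic above, using only the figures given in the preceding paragraphs:

```python
# Figures from the preceding paragraphs.
target_population = 57_174          # 15-60 year-olds, 2009 National Census
required_respondents = round(target_population * 0.0236)
print(required_respondents)         # 1349 -> reported as 1,350

villages, households_per_village, people_per_household = 20, 30, 3
planned_respondents = villages * households_per_village * people_per_household
print(planned_respondents)          # 1800, comfortably above the 1,350 minimum
```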
There was no deviation from sample design.
Face-to-face [f2f]
The survey instrument contains five sections, as follows:
1. Individual profile
2. Education experience
3. Language experience
4. Literacy assessment
5. Employment experience
The Individual Profile section of the survey was designed to capture information about the respondents’ gender and age, to allow disaggregation analysis. The first section of the survey also included questions relating to the respondents’ number of children, sources of information used in the previous month, and the respondents’ attitudes to literacy and education.
The second and third parts of the survey were designed to capture information about the respondents’ educational and language experience. The questions in the second part of the survey explored the education history of the individual, including the highest level of schooling attended and attained, as well as reasons behind non-completion where appropriate.
The third part of the survey questionnaire explored respondents’ language preferences in different situations, and asked respondents to self-declare their literacy status.
The fourth part of the survey is the literacy assessment, which was administered to those participants who self-declared an ability to read one of the three official languages - English, French or Bislama. Therefore, those respondents who indicated in Part 3 that they could read easily, or read some of their preferred official language, participated in the literacy assessment. In contrast, those respondents who indicated that they could not read one of the official languages did not undertake the literacy assessment and were classified as non-literate.
The fifth part of the survey looked at the employment experience of respondents. It was designed to extract information about individuals’ participation in the formal economy through cash-paying employment.
The survey results were encoded using the Census & Survey Processing System (CSPro) and the data were analysed using the Statistical Package for the Social Sciences (SPSS). For further explanatory notes on the survey analysis, see Appendix C of the external resource entitled Vanuatu Rural Shefa Province Education Experience Survey and Literacy Assessment Report.
100% response rate.
The required sample size was 2.36% of the total population, i.e. 1,350 respondents. Across the households selected for the sample in Shefa Province, 1,475 interviews were conducted, which is above the minimum sample size of 1,350 people. The survey sample comprised 628 males (42.6%) and 846 females (57.4%). All respondents were between the ages of 15 and 60 years, so as to encompass both the youth and adult demographic.
This data collection contains information gathered in the Survey of Income and Education (SIE) conducted in April-July 1976 by the Census Bureau for the United States Department of Health, Education, and Welfare (HEW). Although national estimates of the number of children in poverty were available each year from the Census Bureau's Current Population Survey (CPS), those estimates were not statistically reliable on a state-by-state basis. In enacting the Educational Amendments of 1974, Congress mandated that HEW conduct a survey to obtain reliable state-by-state data on the numbers of school-age children in local areas with family incomes below the federal poverty level. This was the statistic that determined the amount of grant a local educational agency was entitled to under Title 1, Elementary and Secondary Education Act of 1965. (Such funds were distributed by HEW's Office of Education.) The SIE was the survey created to fulfill that mandate. Its questions include those used in the Current Population Survey regarding current employment, past work experience, and income. Additional questions covering school enrollment, disability, health insurance, bilingualism, food stamp recipiency, assets, and housing costs enabled the study of the poverty concept and of program effectiveness in reaching target groups. Basic household information also was recorded, including tenure of unit (a determination of whether the occupants of the living quarters owned, rented, or occupied the unit without rent), type of unit, household language, and for each member of the household: age, sex, race, ethnicity, marital history, and education.
School enrollment data are used to assess the socioeconomic condition of school-age children. Government agencies also require these data for funding allocations and program planning and implementation.
Data on school enrollment and grade or level attending were derived from answers to Question 10 in the 2015 American Community Survey (ACS). People were classified as enrolled in school if they were attending a public or private school or college at any time during the 3 months prior to the time of interview. The question included instructions to “include only nursery or preschool, kindergarten, elementary school, home school, and schooling which leads to a high school diploma, or a college degree.” Respondents who did not answer the enrollment question were assigned the enrollment status and type of school of a person with the same age, sex, race, and Hispanic or Latino origin whose residence was in the same or nearby area.
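The assignment rule described above is a form of hot-deck imputation. A minimal sketch, assuming hypothetical column names and using the modal value within each demographic cell as the donor value (the actual ACS procedure also matches donors geographically):

```python
import pandas as pd

def hot_deck_enrollment(df: pd.DataFrame) -> pd.DataFrame:
    """Fill missing 'enrolled' values from donors in the same demographic cell."""
    keys = ["age", "sex", "race", "hispanic"]
    # Donor value: the modal enrollment status within each cell.
    donors = (df.dropna(subset=["enrolled"])
                .groupby(keys)["enrolled"]
                .agg(lambda s: s.mode().iloc[0]))
    filled = df.set_index(keys)["enrolled"].fillna(donors)
    return df.assign(enrolled=filled.to_numpy())

demo = pd.DataFrame({
    "age": [17, 17, 17], "sex": ["F", "F", "F"],
    "race": ["w", "w", "w"], "hispanic": ["n", "n", "n"],
    "enrolled": ["public", None, "public"],
})
print(hot_deck_enrollment(demo))  # the missing row takes the cell's donor value
```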
School enrollment is only recorded if the schooling advances a person toward an elementary school certificate, a high school diploma, or a college, university, or professional school (such as law or medicine) degree. Tutoring or correspondence schools are included if credit can be obtained from a public or private school or college. People enrolled in “vocational, technical, or business school” - such as postsecondary vocational, trade, or hospital school, and on-the-job training - were not reported as enrolled in school. Field interviewers were instructed to classify individuals who were home schooled as enrolled in private school. The guide sent out with the mail questionnaire includes instructions for how to classify home schoolers.
Enrolled in Public and Private School – Includes people who attended school in the reference period and indicated they were enrolled by marking one of the questionnaire categories for “public school, public college,” or “private school, private college, home school.” The instruction guide defines a public school as “any school or college controlled and supported primarily by a local, county, state, or federal government.” Private schools are defined as schools supported and controlled primarily by religious organizations or other private groups. Home schools are defined as “parental-guided education outside of public or private school for grades 1-12.” Respondents who marked both the “public” and “private” boxes are edited to the first entry, “public.”
Grade in Which Enrolled – From 1999-2007, in the ACS, people reported to be enrolled in “public school, public college” or “private school, private college” were classified by grade or level according to responses to Question 10b, “What grade or level was this person attending?” Seven levels were identified: “nursery school, preschool;” “kindergarten;” elementary “grade 1 to grade 4” or “grade 5 to grade 8;” high school “grade 9 to grade 12;” “college undergraduate years (freshman to senior);” and “graduate or professional school (for example: medical, dental, or law school).”
In 2008, the school enrollment questions had several changes. “Home school” was explicitly included in the “private school, private college” category. For question 10b, the categories changed to the following: “Nursery school, preschool,” “Kindergarten,” “Grade 1 through grade 12,” “College undergraduate years (freshman to senior),” and “Graduate or professional school beyond a bachelor’s degree (for example: MA or PhD program, or medical or law school).” The survey question allowed a write-in for the grades enrolled from 1-12.
Question/Concept History – Since 1999, the ACS enrollment status question (Question 10a) refers to “regular school or college,” while the 1996-1998 ACS did not restrict reporting to “regular” school, and contained an additional category for the “vocational, technical or business school.” The 1996-1998 ACS used the educational attainment question to estimate level of enrollment for those reported to be enrolled in school, and had a single year write-in for the attainment of grades 1 through 11. Grade levels estimated using the attainment question were not consistent with other estimates, so a new question specifically asking grade or level of enrollment was added starting with the 1999 ACS questionnaire.
Limitation of the Data – Beginning in 2006, the population universe in the ACS includes people living in group quarters. Data users may see slight differences in levels of school enrollment in any given geographic area due to the inclusion of this population. The extent of this difference, if any, depends on the type of group quarters present and whether the group quarters population makes up a large proportion of the total population. For example, in areas that are home to several colleges and universities, the percent of individuals 18 to 24 who were enrolled in college or graduate school would increase, as people living in college dormitories are now included in the universe.
In 2004, the Bank, jointly with other donors and the Government of Mozambique, prepared a Poverty and Social Impact Analysis on the issue of fee reform in primary school. Partly as a result of the study findings, the Government took the step of abolishing tuition fees in primary education. In 2006, the Ministry of Education and Culture (MEC) requested a repeat of this analysis, as well as a similar baseline study on barriers to enrollment for the poor in secondary education. In particular, the MEC sought World Bank assistance in (a) evaluating the success of the reforms in primary education financing to date, and (b) formulating new policies and initiatives to reduce the barriers the poorest households face in accessing primary and secondary education. This panel survey is part of the Bank's response to this request.
Nationally representative
individuals, households
The survey was designed to target eligible children/students (i.e. children aged 0-17 in 2003, or members enrolled in school in 2003) from the IAF sample.
Sample survey data [ssd]
The Education Outcomes Panel Survey (NPS) was designed as a panel survey based on a subsample of households interviewed in the 2002/03 Inquérito aos Agregados Familiares (IAF), a national household income and expenditure survey conducted in all provinces of Mozambique from July 2002 to June 2003. The NPS data collection took place from September 2008 to February 2009 and it was performed by a contractor in Mozambique (KPMG), with World Bank and UNICEF field supervision.
The NPS sampling frame consists of enumeration areas (EAs) that were drawn to correspond to a particular set of months of the 2002/03 IAF, namely March to May 2003, since the IAF is expected to have a nationally representative subsample of EAs assigned to each quarter. It is important to highlight that the NPS data are nationally representative at the rural and urban level, but not representative below this level. The main reason is that the IAF sample was clustered to maximize efficiency in the data collection process across a 12-month period, while the NPS sample, due to cost constraints, includes only 3 months. Therefore, the NPS sample does not have enough geographic dispersion to be representative at the province level or below.
All IAF households in the enumeration areas during the months of March-May were included in the NPS sample, resulting in 221 EAs and 2,234 households. This sampling strategy was chosen to reduce the effect of seasonality in the panel analysis when comparing the 2002/03 IAF data to the 2008 NPS data for the same sample households. Originally it was planned to interview all the IAF sample households in these EAs during the same month in which they had been interviewed for the 2002/03 IAF. However, because of delays in the survey planning process, the data collection for the NPS was postponed and took place from September 2008 to February 2009. The survey was designed to target eligible children/students (i.e. children aged 0-17 in 2003, or members enrolled in school in 2003) from the IAF sample. The households in the NPS sample were divided into two categories based on their status in 2003:
A. Target 2003 households. These are households that meet at least one of the following criteria:
- Households that had at least one child 0-17 years old in 2003 (see question a13 in the questionnaire)
- Households that had someone in primary or secondary school in 2003, regardless of age (see question a14 in the questionnaire)
B. Alternate 2003 households (14% of original NPS sample)
For the households that did not have any children or students in 2003 but were part of the IAF sample and were in the NPS enumeration areas, the following two questions were asked of the first person who was found in the alternate household in 2008:
- Does this person's 2008 household currently have anyone who is between 5 and 17 years of age? (see question a15 in the questionnaire)
- Does this person's 2008 household currently have anyone who is attending primary or secondary school? (see question a16 in the questionnaire)
If the answer was YES to either question (a15 or a16), the interviewer proceeded with the entire questionnaire. If the answer was NO to both questions, the interviewer stopped the interview.
In sum, target households are the source for the panel of children, while alternate households were included to supplement sample size.
There were two types of tracking in the NPS: that of households, and that of children/students who split from the original 2003 household and joined new households in 2008. If the entire 2003 household had moved by 2008, the field team would gather their new contact information from local leaders, neighbors, friends, etc., and follow and interview the household at their new location, provided the household moved within the district (the survey only followed households/children that moved within the district). New members of the household were also included in the interview.
If the 2003 household had split by 2008 and a member who moved out was a target member (child/student in 2003) who had moved within the district, then the team followed the individual and interviewed both the original household (if a target member still lived there) and the split household.
The screening questions for tracking are in section B1 of the questionnaire. A member would be tracked if b100a = 1 (an indicator of whether the member was a target member, i.e. 17 or younger in 2003 or attending school in 2003), b108 = 2 (the member no longer lives in the household), and b111 <= 2 (the member moved within the same village or district). If all these conditions were met, question B112 (should the member be tracked?) should be 1 (YES) and the household should be followed. The variable "sp" indicates whether the household was the original (sp=0) or a split household (sp>=1).
In case all target members (b100a=1) moved out of the household, the interviewer should end the interview with the original household at question B114.
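The screening rule described above reduces to a simple conjunction of three conditions. A minimal sketch of that logic, using the variable names from section B1:

```python
def should_track(b100a: int, b108: int, b111: int) -> bool:
    """Return True when B112 should be 1 (YES) and the member followed."""
    was_target = b100a == 1    # aged 17 or under in 2003, or in school in 2003
    moved_out = b108 == 2      # no longer lives in the household
    moved_nearby = b111 <= 2   # moved within the same village or district
    return was_target and moved_out and moved_nearby

print(should_track(b100a=1, b108=2, b111=1))  # True: follow the split household
print(should_track(b100a=1, b108=2, b111=3))  # False: moved outside the district
```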
Face-to-face [f2f]
Inquérito aos Agregados Familiares (IAF) 2002 -2003
NPS Survey 2008-2009
- General Household Questionnaire: modules A, B0, B1, B2 (demographics), C0 (education), D0, D1, D2, D3 (employment), E (household characteristics), H (education quality perception), I (transfers)
- Consumption module: modules F, GA, GB, GC, GD
- Education Event History Module: module C1
- Education Expenditure Modules: module C2
The City of Norfolk is committed to using data to inform decisions and allocate resources. An important source of data is input from residents about their priorities and satisfaction with the services we provide. Norfolk last conducted a citywide survey of residents in 2022.
To provide up-to-date information regarding resident priorities and satisfaction, Norfolk contracted with ETC Institute to conduct a survey of residents. This survey was conducted in May and June 2024; surveys were sent via the U.S. Postal Service, and respondents were given the choice of responding by mail or online. This survey represents a random and statistically valid sample of residents from across the city, including each Ward. ETC Institute monitored responses and followed up to ensure all sections of the city were represented. Additionally, an opportunity was provided for residents not included in the random sample to take the survey and express their views. This dataset includes all random sample survey data including demographic information; it excludes free-form comments to protect privacy. It is grouped by Question Category, Question, Response, Demographic Question, and Demographic Question Response. This dataset will be updated every two years.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
Sexual, romantic, and related orientations across all institutions, based on the queered survey (n = 1932).
Terms of use: https://www.icpsr.umich.edu/web/ICPSR/studies/36067/terms
The Fast Response Survey System (FRSS) was established in 1975 by the National Center for Education Statistics (NCES), United States Department of Education. FRSS is designed to collect issue-oriented data within a relatively short time frame. FRSS collects data from state education agencies, local education agencies, public and private elementary and secondary schools, public school teachers, and public libraries. To ensure minimal burden on respondents, the surveys are generally limited to three pages of questions, with a response burden of about 30 minutes per respondent. Sample sizes are relatively small (usually about 1,000 to 1,500 respondents per survey) so that data collection can be completed quickly. Data are weighted to produce national estimates of the sampled education sector. The sample size is large enough to permit limited breakouts by classification variables. However, as the number of categories within the classification variables increases, the sample size within categories decreases, which results in larger sampling errors for the breakouts by classification variables. The Elementary School Arts Education Survey, Fall 2009 data provide national estimates on student access to arts education and resources available for such instruction in public elementary schools during fall 2009. This is one of a set of seven surveys that collected data on arts education during the 2009-10 school year. In addition to this survey, the set includes a survey of secondary school principals, three elementary teacher-level surveys, and two secondary teacher-level surveys. A stratified sample design was used to select principals for this survey. Data collection was conducted September 2009 through June 2010, and 988 eligible principals completed the survey by web, mail, fax, or telephone. The elementary school survey collected data on the availability and characteristics of music, visual arts, dance, and drama/theatre instruction; the type of space used for arts instruction; the availability of curriculum guides for arts teachers to follow; the availability of curriculum-based arts education activities outside of regular school hours; and whether those teaching the subject are arts specialists. Principals also reported on school or district provision of teacher professional development in the arts; arts education programs, activities, and events; and school-community partnerships. Principals were also asked to provide administrative information such as school instructional level, school enrollment size, community type, and percent of students eligible for free or reduced-price lunch.
The Annual Population Survey (APS) is a major survey series, which aims to provide data that can produce reliable estimates at local authority level. Key topics covered in the survey include education, employment, health and ethnicity. The APS comprises key variables from the Labour Force Survey (LFS) (held at the UK Data Archive under GN 33246), all of its associated LFS boosts and the APS boost. Thus, the APS combines results from five different sources: the LFS (waves 1 and 5); the English Local Labour Force Survey (LLFS), the Welsh Labour Force Survey (WLFS), the Scottish Labour Force Survey (SLFS) and the Annual Population Survey Boost Sample (APS(B) - however, this ceased to exist at the end of December 2005, so APS data from January 2006 onwards will contain all the above data apart from APS(B)). Users should note that the LLFS, WLFS, SLFS and APS(B) are not held separately at the UK Data Archive. For further detailed information about methodology, users should consult the Labour Force Survey User Guide, selected volumes of which have been included with the APS documentation for reference purposes (see 'Documentation' table below).
The APS aims to provide enhanced annual data for England, covering a target sample of at least 510 economically active persons for each Unitary Authority (UA)/Local Authority District (LAD) and at least 450 in each Greater London Borough. In combination with local LFS boost samples such as the WLFS and SLFS, the survey provides estimates for a range of indicators down to Local Education Authority (LEA) level across the United Kingdom.
APS Well-Being data
Since April 2011, the APS has included questions about personal and subjective well-being. The responses to these questions have been made available as annual sub-sets to the APS person-level files. It is important to note that the achieved sample for the well-being questions within the dataset is approximately 165,000 people. This reduction is because the well-being questions are only asked of persons aged 16 and above who gave a personal interview; proxy answers are not accepted. As a result, some caution should be used when analysing responses to the well-being questions for detailed geographic areas, and in relation to any other variables where respondent numbers are relatively small. For lower-level geography analysis, it is recommended that the variable UACNTY09 is used.
As well as annual datasets, three-year pooled datasets are available. When combining multiple APS datasets, it is important to account for the rotational design of the APS and ensure that no person appears more than once in the multiple-year dataset. This is because the well-being datasets are not designed to be longitudinal, i.e. they are not designed to track individuals over time or be used for longitudinal analysis. They are instead cross-sectional, designed to use a cross-section of the population to make inferences about the whole population. For this reason, the three-year dataset has been designed to include only a selection of the cases from the individual-year APS datasets, chosen in such a way that no individuals are included more than once, and the cases included are approximately equally spread across the three years. Further information is available in the 'Documentation' section below.
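A minimal sketch of that pooling constraint, assuming a hypothetical unique person identifier `person_id` (not an actual APS variable name):

```python
import pandas as pd

# Dummy annual files; "person_id" stands in for a unique person identifier.
df_2011 = pd.DataFrame({"person_id": [1, 2, 3], "wellbeing": [7, 8, 6]})
df_2012 = pd.DataFrame({"person_id": [3, 4, 5], "wellbeing": [6, 9, 7]})
df_2013 = pd.DataFrame({"person_id": [5, 6, 7], "wellbeing": [8, 7, 9]})

pooled = pd.concat(
    [df.assign(aps_year=y)
     for y, df in [(2011, df_2011), (2012, df_2012), (2013, df_2013)]],
    ignore_index=True,
)
# The rotational design means a person can appear in consecutive years;
# keep each person exactly once in the pooled cross-section.
pooled = pooled.drop_duplicates(subset="person_id", keep="first")
print(pooled)
```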
Secure Access APS Well-Being data
Secure Access datasets for the APS Well-Being include additional variables not included in either the standard End User Licence (EUL) versions (see under GN 33357) or the Special Licence (SL) access versions (see under GN 33376). Extra variables that typically can be found in the Secure Access version but not in the EUL or SL versions relate to:
Description: The Educational Attainment Thematic Report is compiled using data from the Labour Force Survey (LFS). It is a household survey which replaced the Quarterly National Household Survey (QNHS) at the beginning of Q3 2017. The LFS is the official source of quarterly labour force estimates for Ireland, including the official rates of employment and unemployment. Questions on educational attainment are included in the core LFS questionnaire each quarter. The Educational Attainment Thematic Report presents the LFS data for adults between 18 and 64 years old with differing levels of educational attainment based on these questions. This data provides a summary of the annual results of education attainment levels across the regional geographies in Ireland.
Geography available in RDM: State, Regional Assembly and Strategic Planning Area (SPA).
Source: CSO Educational Attainment Thematic Report
Weblink: https://www.cso.ie/en/releasesandpublications/ep/p-eda/educationalattainmentthematicreport2021/
Date of last source data update: February 2025
Update schedule: Annual
The main purpose of a Household Income and Expenditure Survey (HIES) was to present high quality and representative national household data on income and expenditure in order to update Consumer Price Index (CPI), improve statistics on National Accounts and measure poverty within the country.
The main objectives of this survey were to:
- update the weight of each expenditure item (from COICOP) and obtain weights for the revision of the Consumer Price Index (CPI) for Funafuti
- provide data on the household sector's contribution to the National Accounts
- describe the structure of consumption for food security
- provide information on the nature and distribution of household income, expenditure and food consumption patterns, and on household living standards, useful for planning purposes
- provide information on the economic activity of men and women to study gender issues
- generate the income distribution for poverty analysis
The 2010 Household Income and Expenditure Survey (HIES) is the third HIES that was conducted by the Central Statistics Division since Tuvalu gained political independence in 1978.
This survey deals mostly with expenditure and income, on the cash side and the non-cash side (gifts, home production). Moreover, a lot of information is collected:
at the household level:
- goods possession
- description of the dwelling
- water tank capacity
- fruits and vegetables in the garden
- livestock
at the individual level:
- education level
- employment
- health
National coverage: Funafuti and the Outer Islands.
The scope of the 2010 Household Income and Expenditure Survey (HIES) was all occupied households in Tuvalu. Households are the sampling unit, defined as a group of people (related or not) who pool their money, and cook and eat together. It is not the physical structure (dwelling) in which people live. The HIES covered all persons who were considered to be usual residents of private dwellings (they must have been living in Tuvalu for a period of 12 months, or intend to live in Tuvalu for a period of 12 months, in order to be included in the survey). Usual residents who are temporarily away are included as well (e.g., for work or a holiday).
All private households are included in the sampling frame. In each selected household, the current residents are surveyed, along with people who are usual residents but currently away (for work, health or holiday reasons, or boarding students, for example). If a person had been residing in Tuvalu for less than one year:
- if they intend to reside more than 12 months => included
- if they do not intend to reside more than 12 months => out of scope.
Sample survey data [ssd]
The Tuvalu 2010 Household Income and Expenditure Survey (HIES) outputs breakdowns at the domain level, the domains being Funafuti and the Outer Islands. To achieve this, and to fit the budget constraint, a third of the households were selected in both domains. It was decided that a 33% (one third) sample was sufficient to achieve suitable levels of accuracy for key estimates in the survey, so the sample selection was spread proportionally across all the islands except Niulakita, which was excluded due to its remoteness and small size. The selection method used was simple random sampling, meaning that within each domain households were directly selected from the population frame (the updated 2009 household listing).
For selection purposes, in the Outer Islands domain, each island was treated as a separate stratum and an independent sample (one third) was selected from each. The strategy used was to list each dwelling on the island by its geographical position and run a systematic skip through the list to achieve the 33% sample. This approach ensured that the sample was spread out across each island as much as possible and was thus more representative.
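A minimal sketch of the within-island systematic skip described above: dwellings are listed in geographical order and every third one is taken to reach a one-third sample (the dwelling list and random start are illustrative):

```python
import random

def systematic_sample(dwellings, fraction=1 / 3):
    """Step through a geographically ordered list to get an evenly spread sample."""
    step = round(1 / fraction)         # skip interval; 3 for a one-third sample
    start = random.randrange(step)     # random start keeps the sample unbiased
    return dwellings[start::step]

island_dwellings = [f"dwelling_{i:03d}" for i in range(141)]  # e.g. Nui: 141 dwellings
sample = systematic_sample(island_dwellings)
print(len(sample))                     # 47, matching the one-third target for Nui
```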
Population and sample counts of dwellings by island for the 2010 HIES:
- Nanumea: population 123; sample 41
- Nanumaga: population 117; sample 39
- Niutao: population 138; sample 46
- Nui: population 141; sample 47
- Vaitupu: population 298; sample 100
- Nukufetau: population 141; sample 47
- Nukulaelae: population 78; sample 26
- Funafuti: population 791; sample 254
- TOTAL: population 1,827; sample 600.
Face-to-face [f2f]
Three forms were used. Each question is written in English and translated into Tuvaluan on the same version of the questionnaire. The questionnaire was largely based on the previous one (2004 survey).
Household Schedule
This questionnaire, to be completed by interviewers, is used to collect information about the household composition and living conditions, and is also the main form for collecting expenditure on goods and services purchased infrequently.
Individual Schedule
There were two individual schedules:
- health and education
- labour force, employment activity and income (individuals aged 15 and above): wages and salaries; working own business; agriculture and livestock; fishing; income from handicrafts; income from gambling; small-scale activities; jobs in the last 12 months; other income; children's income; tobacco and alcohol use; other activities; seafarers
Diary (one diary per week, over a two-week period; two diaries per household were required)
The diaries are used to record all household expenditure and consumption over the two-week diary-keeping period. The diaries are to be filled in by the household members, with assistance from interviewers when necessary. They cover:
- all kinds of expenses
- home production: food and drink (eaten by the household, given away, sold)
- goods taken from own business (consumed, given away)
- monetary gifts (given away, received, winnings from gambling)
- non-monetary gifts (given away, received, winnings from gambling).
Consistency of the data:
- each questionnaire was checked by the supervisor during and after the collection
- before data entry, all questionnaires were coded
- the CSPro data entry system included inconsistency checks, which allowed National Statistics Office staff to spot errors and correct them with imputed estimates based on their own knowledge (there was no time for double entry); there were 4 data entry operators
The checks covered:
1. presence of all the forms for each household
2. consistency of data within the questionnaire
At this stage, all the errors were corrected on the questionnaire and in the data entry system at the same time.
The final response rates for the survey were very pleasing, with an average rate of 97 per cent across all selected islands. The response rates were derived by dividing the number of fully responding households by the number of selected households in scope of the survey that were not vacant.
Response rates for the Tuvalu 2010 Household Income and Expenditure Survey (HIES):
- Nanumea: 100%
- Nanumaga: 100%
- Niutao: 98%
- Nui: 100%
- Vaitupu: 99%
- Nukufetau: 89%
- Nukulaelae: 100%
- Funafuti: 96%
As can be seen in the table, four of the islands managed a 100 per cent response, whereas only Nukufetau had a response rate of less than 90 per cent.
Further explanation of response rates can be located in the external resource entitled Tuvalu 2010 HIES Report Table 1.2.
The quality of the results can be found in the report provided in this documentation.
This survey by the United Nations Educational, Scientific and Cultural Organization (UNESCO), the United Nations Children's Fund (UNICEF) and the World Bank seeks to collect information on national education responses to school closures related to the COVID-19 pandemic. The questionnaire is designed for Ministry of Education officials at central or decentralized level in charge of school education. The questionnaire does not cover higher education or technical and vocational education and training. Analysis of results will allow for policy learning across the diversity of country settings in order to better inform local/national responses and prepare for the reopening of schools. The survey will be run on a regular basis to ensure that the latest impact and responses are captured. In light of the current education crisis, the COVID-19 education response coordinated by UNESCO with our partners is deemed urgent. A first wave of data collection started in May and lasted until mid-June 2020. A second wave of data collection will start at the beginning of July. A link to the online survey questionnaire, as well as other formats, will be available shortly.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
The Vocational School Student Survey (VET Student Survey) 2015 is a total study charting the experiences of young people studying in Finnish vocational institutions. The survey was conducted collaboratively by the National Union of Vocational Students in Finland (SAKKI), the Research Foundation for Studies and Education (Otus), and the Ministry of Education and Culture. Main themes of the survey included current and former studies, experiences relating to studies, financial circumstances and housing, future plans, and well-being and leisure time.
Relating to studies at the time of the survey, the respondents were asked, among others, their field of education, year of study, and distance to the vocational institution. Earlier studies and applying to vocational studies were charted by asking the respondents, for instance, whether vocational studies had been discussed in their families, whether their friends or siblings studied in a vocational institution, how clear the decision to opt for vocational studies had been, and whether they had worked or completed other studies before starting vocational studies.
Experiences of studies and teaching were examined with questions about the time spent on studies in a week, teaching and guidance received, balancing and managing with studies, and money spent on study materials. Possible learning difficulties and support received for these difficulties were also surveyed. With regard to study progress, satisfaction with studies was charted as well as feelings of studying the right field, prospects of graduating, things slowing down study progress, and views on the importance of vocational studies.
Working, housing and financial circumstances were investigated by asking about working during studies and in the summer, housing during the semesters, financial help from parents and relatives, and sufficiency of money for expenses. Concerning occupational life, opinions were probed on a number of statements about employment as well as employment prospects after graduation, and views on the importance of various things for a successful career. Future plans to study were surveyed. Finally, well-being and leisure time were examined with questions about friends, social relationships, bullying and discrimination, sleep, alcohol use, hobbies and Internet use.
Background variables included the respondent's year of birth, gender, and mother tongue. The time the respondent had lived in Finland was further charted, along with languages spoken with parents, and parents' employment status and education level.
The Service Delivery Indicators (SDI) are a set of health and education indicators that examine the effort and ability of staff and the availability of key inputs and resources that contribute to a functioning school or health facility. The indicators are standardized allowing comparison between and within countries over time.
The Education SDIs include teacher effort, teacher knowledge and ability, and the availability of key inputs (for example, textbooks, basic teaching equipment, and infrastructure such as blackboards and toilets). The indicators provide a snapshot of the learning environment and the key resources necessary for students to learn.
Nigeria Service Delivery Indicators Education Survey was implemented in 2013 by the World Bank and the Research Triangle Institute International. The survey implementation was preceded by consultations with stakeholders in Nigeria to adapt instruments to the country context while maintaining comparability across countries. In addition, the implementation was done with close collaboration with the Universal Basic Education Commission, and in close coordination with the relevant state authorities (i.e. State Ministries of Education, and the State Universal Education Boards where they existed). Data was collected from primary schools in four states (Anambra, Bauchi, Ekiti, and Niger) using personal interviews and provider assessments. A total of 760 randomly selected public and private schools (190 per state) were surveyed, with 2,435 and 5,754 teachers assessed for knowledge and effort respectively. The sample was selected to make the survey representative at the State level, allowing for disaggregation by provider type (private/public) and location (rural/urban).
Four states: Anambra, Bauchi, Ekiti, and Niger.
Schools, teachers, students.
All primary schools.
Sample survey data [ssd]
The sampling strategy was designed to produce state-representative estimates, targeting the estimation of a proportion with an absolute error of three percentage points, for a proportion of 0.5 (i.e., the highest-variance case), with 95 percent confidence per state (an equal sample size was used for each state).
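For reference, the nominal simple-random-sample size implied by these precision targets follows the textbook formula for a proportion (a sketch consistent with the stated targets; the realized school sample of 190 per state also reflects the stratification and clustering described below):

```python
# n = z^2 * p * (1 - p) / e^2
z, p, e = 1.96, 0.5, 0.03   # 95% confidence, worst-case variance, +/-3 points
n = (z ** 2) * p * (1 - p) / e ** 2
print(round(n))  # ~1067: the nominal simple-random-sample size per state
```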
The strata were constructed according to ownership, urban/rural, and socioeconomic poverty status. The allocation was made in proportion to size for each sub-stratum within public and private. Within strata, simple random sampling was used. Finally, replacement schools were preselected, with a predetermined replacement order within strata.
A total of 190 schools were sampled from each of the four states (Anambra, Bauchi, Ekiti, and Niger).
The target population is all public primary-level school children. Since parts of the school questionnaire were administered to teachers and pupils at the grade four level, all public schools with at least one grade four class formed the sampling frame. The sample frame was created using the list of public schools from UBEC (Universal Basic Education Commission) and private schools from states.
None.
Face-to-face [f2f]
The SDI Education Survey Questionnaire consists of six modules:
Module 1: School Information - Administered to the head of the school to collect information on school type, facilities, school governance, pupil numbers, and school hours. It includes direct observations of school infrastructure by enumerators.
Module 2a: Teacher Absence and Information - Administered to the headteacher and individual teachers to obtain a list of all schoolteachers, to measure teacher absence and to collect information on teacher characteristics (this module was not included in this dataset).
Module 2b: Teacher Absence and Information - Unannounced visit to the school to assess absence rate.
Module 3: School Finances - Administered to the headteacher to collect information on school finances (this data is unharmonized).
Module 4: Classroom Observation - An observation module to assess teaching activities and classroom conditions.
Module 5: Pupil Assessment - A test of pupils to have a measure of pupil learning outcomes in mathematics and language in grade four.
Module 6: Teacher Assessment - A test of teachers covering mathematics and language subject knowledge and teaching skills.
Data entry was done using CSPro; quality control was performed in Stata.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
The Vocational School Student Survey (VET Student Survey) 2019 is a total study charting the experiences of young people studying in Finnish vocational education institutions. The survey was conducted by the Research Foundation for Studies and Education (Otus) in collaboration with the National Union of Vocational Students in Finland (SAKKI), which also funded the study with funding received from the Ministry of Education and Culture and the Ministry of Economic Affairs and Employment. Main themes of the survey included applying for studies, experiences relating to studies and teaching, financial circumstances, plans for the future and working life, and well-being and leisure time.
First, the respondents were asked about their studies at present with questions concerning, for instance, their field of education, how they financed their living costs during studies, and whether they were studying towards a dual or double degree. Questions also surveyed how the respondents had entered their studies (via the joint application system or continuous admission), whether their current field of education had been their first choice when applying, and whether they had begun their studies straight away in the autumn after completing comprehensive school.
The respondents' decision to apply for vocational studies was further examined with questions regarding, for instance, whether vocational studies had been discussed or recommended in their families or at school, whether their friends or siblings currently studied or had previously studied in a vocational institution, and how clear the decision to opt for vocational studies had been. The respondents were also asked whether they had worked or completed other studies before starting vocational studies, and how they had performed in earlier education.
The respondents' experiences of studies and teaching were examined with questions about the time spent on studies in a week, form and sufficiency of the teaching and guidance received, balancing and managing studies, and the atmosphere of their school and study community. Further questions focused on the respondents' opinions on the personalisation of studies and competence-based studying, including, for instance, whether they thought they were able to influence what and how they studied and whether their career plans had been taken into account in their study plan. Opinions were also charted regarding on-the-job learning.
Possible learning difficulties and support received for these difficulties were surveyed next. With regard to study progress, satisfaction with studies and the institution itself was charted as well as feelings of studying the right field, prospects of graduating, things slowing down study progress, and views on the importance of vocational studies.
Working, housing and financial circumstances were investigated by asking about working during studies and in the summer, housing during the semesters, financial help from parents and relatives, and sufficiency of money for expenses. Concerning occupational life, opinions were probed on a number of statements about employment as well as employment prospects after graduation, and views on the importance of various things for a successful career. Future plans to study were surveyed. Well-being and leisure time were examined with questions about friends, social relationships, bullying and discrimination, sleep, hobbies, and Internet and social media use.
Finally, the respondents' values and attitudes were examined with a set of statements concerning, for example, whether income differences should be reduced, whether environmental protection should be the first priority, and whether Finland's EU membership was a good thing. Background variables included the respondent's year of birth, gender and mother tongue, as well as the time the respondent had lived in Finland, the languages spoken with their parents, and the parents' employment status and level of education.
The Distance Education Courses for Public Elementary and Secondary School Students, 2004-05 (FRSS 89) study is part of the Fast Response Survey System (FRSS) program (https://nces.ed.gov/surveys/frss/); program data are available from 1998-99 onwards. FRSS 89 is a sample survey that provides national estimates for technology-based distance education courses in public elementary and secondary schools. The questionnaire was mailed to the superintendents of the public school districts sampled for the survey, and the response rate was 96 percent. Key statistics produced from the study are the percentage of districts and the percentage of schools (by instructional level) with students enrolled in technology-based distance education courses; the number of enrollments in distance education courses (by instructional level) was also collected. The survey contained questions on the completion status of these enrollments, the technologies used to deliver the courses, and where students accessed online distance education courses (e.g., at school or at home). It also asked whether technology-based distance education was used to offer Advanced Placement (AP) and college-level courses, and districts with students enrolled in technology-based distance education courses were asked whether they planned to expand these courses.
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This repository includes the questionnaire and dataset collected for the paper "DEI in Computing Higher Education: Survey and Analysis of Brazilian and Finnish University Practices," which was submitted to ESEM 2025.
Paper Abstract:
Background: Efforts have been made in STEM, for example, to encourage women to pursue careers in computing or to promote the importance of team diversity in the field. However, implementing Diversity, Equity, and Inclusion (DEI) in university-level computing education remains underexplored.
Aims: This study compares the current state of DEI in Brazilian and Finnish universities.
Method: We replicated in Brazil an online survey conducted in Finland.
Results: We received 68 responses from Brazilian teachers. We compared the Brazilian and Finnish scenarios for incorporating DEI aspects in the classes.
Conclusions: While the importance of DEI in education is recognized, the implementation of DEI practices varies significantly across institutions and countries. Several challenges hinder this implementation, including a lack of teaching materials, insufficient training, limited institutional support, and concerns about addressing these topics appropriately. Regarding differences between the countries, Brazilian professors rate DEI as more important but report lower satisfaction than their Finnish counterparts, highlighting cultural and demographic factors that affect DEI practices.
Files Description:
Replicated Study:
Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Open Science in (Higher) Education – data of the February 2017 survey
This data set contains:
Survey structure
The survey includes 24 questions, and its structure can be divided into five major themes: material used in courses (5 questions), OER awareness, usage and development (6), collaborative tools used in courses (2), assessment and participation options (5), and demographics (4). The last two questions are an open-text question about general issues on these topics and singular open education experiences, and a request to provide the respondent's e-mail address for follow-up questions. The online survey was created with Limesurvey[1]. Several questions include filters, i.e. they were only shown if a participant had chosen a specific answer beforehand (coded as [n/a] in the Excel file and [.] in SPSS).
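As an illustration, here is a minimal sketch of how this filter coding can be handled when loading the data, assuming Python with pandas and a hypothetical file name; the [n/a] placeholder marks questions a respondent was never shown, so it is best treated as a genuine missing value:

    import pandas as pd

    # Hypothetical file name for the Excel export of the survey data.
    df = pd.read_excel("open-science-survey-2017.xlsx")

    # "[n/a]" marks questions hidden by a filter; converting it to a real
    # missing value excludes it from counts and percentages.
    df = df.replace("[n/a]", pd.NA)

    # Share of respondents who were actually shown each question.
    print(df.notna().mean())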
Demographic questions
Demographic questions asked about the respondent's current position, discipline, year of birth and gender. The classification of research disciplines was adapted to the general disciplines at German higher education institutions. As we wanted a broad classification, we summarised several disciplines and came up with the following list, including the option “other” for respondents who did not feel comfortable with the proposed classification:
The classification of current job positions was likewise chosen according to common positions at German higher education institutions, including positions with teaching responsibilities. Here, too, we included the option “other” for respondents who did not feel comfortable with the proposed classification:
We chose a free (numerical) text field for the respondent's year of birth because we did not want to pre-classify respondents into age intervals. This leaves us the option of running different analyses and examining possible correlations with the respondents' age. A question about the country was left out, as the survey was designed for academics in Germany.
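Because the year of birth is stored as a free numerical value, age intervals can be derived at analysis time rather than fixed in the questionnaire. A minimal sketch with hypothetical answer values, again assuming pandas:

    import pandas as pd

    # Hypothetical free-text answers to the year-of-birth question.
    birth_year = pd.to_numeric(pd.Series(["1975", "1989", "", "1993"]),
                               errors="coerce")

    # Age at the time of the survey (fielded in February 2017), binned into
    # decade-wide intervals chosen only at analysis time.
    age = 2017 - birth_year
    age_group = pd.cut(age, bins=range(20, 80, 10))
    print(age_group.value_counts(sort=False))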
Remark on OER question
Data from earlier surveys revealed that academics are confused about the proper definition of OER[2]. Some seem to understand OER as any free resources, or refer only to open source software (Allen & Seaman, 2016, p. 11). Allen and Seaman (2016) decided to give a broad explanation of OER, avoiding details so as not to tempt participants into claiming awareness; thus, giving an explanation carries a risk of bias. We decided not to give an explanation but to keep the question simple, assuming that someone either knows about OER or does not. Respondents who had not heard the term before probably do not use OER (at least not consciously) or create them.
Data collection
The target group of the survey was academics at German institutions of higher education, mainly universities and universities of applied sciences. To reach them, we sent the survey to various internal and external institutional mailing lists and distributed it via personal contacts. The lists included discipline-based lists, lists from higher education and higher education didactics communities, and lists from open science and OER communities. Additionally, personal e-mails were sent to presidents and contact persons of those communities, and Twitter was used to spread the survey.
The survey was online from February 6th to March 3rd, 2017; e-mails were mainly sent at the beginning of this period and around its mid-term.
Data clearance
We received 360 responses, of which Limesurvey counted 208 as complete and 152 as incomplete. Two responses were marked as incomplete but turned out to be complete after checking, and we added them to the complete responses. Thus, this data set includes 210 complete responses. Of the 150 incomplete responses, 58 respondents did not answer the first question and 40 discontinued after it. The data show a steady decline in responses over the course of the survey; we did not detect any particular question with a high dropout rate. Incomplete responses were deleted and are not included in this data set.
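A minimal sketch of this clearance step, assuming a raw Limesurvey CSV export with a hypothetical file name and pandas; in Limesurvey, the submitdate field is empty for unfinished responses, so the two manually verified responses would still need to be reclassified by hand:

    import pandas as pd

    # Hypothetical file name for the raw export with all 360 responses.
    raw = pd.read_csv("limesurvey-export-raw.csv")

    # Limesurvey leaves submitdate empty for unfinished responses.
    is_complete = raw["submitdate"].notna()
    print(is_complete.sum(), (~is_complete).sum())  # 208 / 152 before rechecking

    # Incomplete responses are dropped from the published data set.
    cleaned = raw[is_complete]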
For data privacy reasons, we deleted seven variables automatically assigned by Limesurvey: submitdate, lastpage, startlanguage, startdate, datestamp, ipaddr and refurl. We also deleted the answers to question 24 (e-mail address).
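Continuing the sketch above, this privacy step amounts to dropping the seven system variables and the e-mail question (the column name for question 24 is hypothetical):

    # The seven Limesurvey system variables listed above.
    system_vars = ["submitdate", "lastpage", "startlanguage", "startdate",
                   "datestamp", "ipaddr", "refurl"]
    cleaned = cleaned.drop(columns=system_vars)

    # Question 24 asked for the respondent's e-mail address.
    cleaned = cleaned.drop(columns=["q24_email"])  # hypothetical column name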
References
Allen, I. E., & Seaman, J. (2016). Opening the Textbook: Educational Resources in U.S. Higher Education, 2015–16. Babson Survey Research Group.
First results of the survey are presented in the poster:
Heck, Tamara, Blümel, Ina, Heller, Lambert, Mazarakis, Athanasios, Peters, Isabella, Scherp, Ansgar, & Weisel, Luzian. (2017). Survey: Open Science in Higher Education. Zenodo. http://doi.org/10.5281/zenodo.400561
Contact:
Open Science in (Higher) Education working group, see http://www.leibniz-science20.de/forschung/projekte/laufende-projekte/open-science-in-higher-education/.
[1] https://www.limesurvey.org
[2] In the Allen and Seaman (2016) survey, the question about the awareness of OER gave a broad explanation, avoiding details so as not to tempt participants into claiming awareness.