Academic journal indicators developed from information contained in the Scopus database (Elsevier B.V.). These indicators can be used to assess and analyze scientific domains.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
This description is part of the blog post "Systematic Literature Review of teaching Open Science" https://sozmethode.hypotheses.org/839
In my opinion, we do not pay enough attention to teaching Open Science in higher education. I therefore designed a seminar that teaches students the practices of Open Science through hands-on qualitative research. About this seminar, I wrote the article "Teaching Open Science and qualitative methods". For that article, I began reviewing the literature on teaching Open Science. The result of my literature review is that certain aspects of Open Science are used in teaching. However, Open Science with all its aspects (Open Access, Open Data, Open Methodology, Open Science Evaluation and Open Science Tools) is not an issue in publications about teaching.
Based on this insight, I started a systematic literature review. I quickly realized that I need help analysing and interpreting the articles and evaluating my preliminary findings. The different disciplinary cultures of teaching the various aspects of Open Science are especially challenging: as a social scientist, I do not have enough insight into other fields to interpret the results correctly. I would therefore like to invite you to participate in this research project!
I am now looking for people who would like to join a collaborative process to further explore and write the systematic literature review on "Teaching Open Science", because I want to turn this project into a Massive Open Online Paper (MOOP). According to the ten rules of Tennant et al. (2019) on MOOPs, it is crucial to find a core group that is enthusiastic about the topic. I am therefore looking for people who are interested in creating the structure of the paper and writing it together with me. I am also looking for people who want to search for and review literature, or evaluate the literature I have already found. Together with the interested persons, I would then define the rules for the project (cf. Tennant et al. 2019). So if you are interested in contributing to the further search for articles and/or enhancing the interpretation and writing of results, please get in touch. For everyone interested in contributing, the list of articles collected so far is freely accessible on Zotero: https://www.zotero.org/groups/2359061/teaching_open_science. The figure shown below provides a first overview of my ongoing work. I created the figure with the free software yEd and uploaded the file to Zenodo so everyone can download and work with it:
To make transparent what I have done so far, I will first introduce what a systematic literature review is. Second, I will describe the decisions I made to start the systematic literature review. Third, I will present the preliminary results.
Systematic literature review – an Introduction
Systematic literature reviews "are a method of mapping out areas of uncertainty, and identifying where little or no relevant research has been done" (Petticrew/Roberts 2008: 2). Fink defines the systematic literature review as a "systemic, explicit, and reproducible method for identifying, evaluating, and synthesizing the existing body of completed and recorded work produced by researchers, scholars, and practitioners" (Fink 2019: 6). The aim of a systematic literature review is to surpass the subjectivity of a researcher's search for literature. However, there can never be a fully objective selection of articles, because the researcher has already made a preselection, for example by deciding on search strings such as "Teaching Open Science". In this respect, transparency is the core criterion for a high-quality review.
In order to achieve high quality and transparency, Fink (2019: 6-7) proposes seven steps.
I have adapted these steps for the “Teaching Open Science” systematic literature review. In the following, I will present the decisions I have made.
Systematic literature review – decisions I made
The dataset contains bibliographic information about scientific articles published by researchers from Norwegian research organizations and is an enhanced subset of data from the Cristin database. Cristin (the current research information system in Norway) is a database with bibliographic records of all research articles with a Norwegian affiliation at a publicly funded research institution. The subset is limited to metadata about journal articles reported in the period 2013-2021 (186,621 records), and further limited to information of relevance for the study (see below). Article metadata are enhanced with open access status from several sources, particularly Unpaywall, DOAJ, and hybrid information where an article is part of a publish-and-read deal.
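The enrichment step described above can be sketched as a small classifier that derives one OA status from merged source flags. This is only an illustration: the field names (`in_doaj`, `unpaywall_is_oa`, etc.) are assumptions, not the dataset's actual columns, and the real study combined the sources in its own way.

```python
def classify_oa(record):
    """Derive an open-access status for one article record from
    Unpaywall/DOAJ/deal flags (illustrative field names, a sketch
    of the enrichment idea, not the study's actual logic)."""
    if record.get("in_doaj"):
        # journal listed in DOAJ -> fully OA journal
        return "gold"
    if record.get("unpaywall_is_oa"):
        # fall back to Unpaywall's own status label if present
        return record.get("unpaywall_oa_status", "open")
    if record.get("publish_and_read_deal"):
        # article published under a publish-and-read agreement
        return "hybrid"
    return "closed"
```

A record would then carry one consolidated status next to its bibliographic fields, which is convenient for the per-year aggregations such a study needs.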
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Version: 5
Authors: Carlota Balsa-Sánchez, Vanesa Loureiro
Date of data collection: 2023/09/05
General description: Publishing datasets according to the FAIR principles can be achieved by publishing a data paper (or software paper) in a data journal or in a standard academic journal. The Excel and CSV files contain a list of academic journals that publish data papers and software papers.
File list:
- data_articles_journal_list_v5.xlsx: full list of 140 academic journals in which data papers and/or software papers could be published
- data_articles_journal_list_v5.csv: full list of 140 academic journals in which data papers and/or software papers could be published
Relationship between files: both files have the same information. Two different formats are offered to improve reuse
Type of version of the dataset: final processed version
Versions of the files: 5th version
- Information updated: number of journals, URL, and document types associated with a specific journal.
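Since the two formats carry the same information, the CSV variant is the easiest to reuse programmatically. A minimal loader might look like this (the filename comes from the file list above; the column names inside the file are deliberately not assumed):

```python
import csv

def load_journal_list(path="data_articles_journal_list_v5.csv"):
    """Read the journal-list CSV into a list of dicts, one per journal.
    Column names are taken from the file's own header row."""
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))
```

`csv.DictReader` keeps whatever headers the file declares, so the loader works unchanged across the dataset's versions even as columns are added.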
Version: 4
Authors: Carlota Balsa-Sánchez, Vanesa Loureiro
Date of data collection: 2022/12/15
General description: Publishing datasets according to the FAIR principles can be achieved by publishing a data paper (or software paper) in a data journal or in a standard academic journal. The Excel and CSV files contain a list of academic journals that publish data papers and software papers.
File list:
- data_articles_journal_list_v4.xlsx: full list of 140 academic journals in which data papers and/or software papers could be published
- data_articles_journal_list_v4.csv: full list of 140 academic journals in which data papers and/or software papers could be published
Relationship between files: both files have the same information. Two different formats are offered to improve reuse
Type of version of the dataset: final processed version
Versions of the files: 4th version
- Information updated: number of journals, URL, document types associated with a specific journal, publisher normalization, and simplification of document types
- Information added: listed in the Directory of Open Access Journals (DOAJ); indexed in Web of Science (WOS); quartile in Journal Citation Reports (JCR) and/or Scimago Journal and Country Rank (SJR); Scopus and Web of Science (WOS) Journal Master List.
Version: 3
Authors: Carlota Balsa-Sánchez, Vanesa Loureiro
Date of data collection: 2022/10/28
General description: Publishing datasets according to the FAIR principles can be achieved by publishing a data paper (or software paper) in a data journal or in a standard academic journal. The Excel and CSV files contain a list of academic journals that publish data papers and software papers.
File list:
- data_articles_journal_list_v3.xlsx: full list of 124 academic journals in which data papers and/or software papers could be published
- data_articles_journal_list_3.csv: full list of 124 academic journals in which data papers and/or software papers could be published
Relationship between files: both files have the same information. Two different formats are offered to improve reuse
Type of version of the dataset: final processed version
Versions of the files: 3rd version
- Information updated: number of journals, URL, document types associated with a specific journal, publisher normalization, and simplification of document types
- Information added: listed in the Directory of Open Access Journals (DOAJ); indexed in Web of Science (WOS); quartile in Journal Citation Reports (JCR) and/or Scimago Journal and Country Rank (SJR).
Erratum - Data articles in journals Version 3:
Botanical Studies -- ISSN 1999-3110 -- JCR (JIF) Q2
Data -- ISSN 2306-5729 -- JCR (JIF) n/a
Data in Brief -- ISSN 2352-3409 -- JCR (JIF) n/a
Version: 2
Author: Francisco Rubio, Universitat Politècnica de València.
Date of data collection: 2020/06/23
General description: Publishing datasets according to the FAIR principles can be achieved by publishing a data paper (or software paper) in a data journal or in a standard academic journal. The Excel and CSV files contain a list of academic journals that publish data papers and software papers.
File list:
- data_articles_journal_list_v2.xlsx: full list of 56 academic journals in which data papers and/or software papers could be published
- data_articles_journal_list_v2.csv: full list of 56 academic journals in which data papers and/or software papers could be published
Relationship between files: both files have the same information. Two different formats are offered to improve reuse
Type of version of the dataset: final processed version
Versions of the files: 2nd version
- Information updated: number of journals, URL, document types associated with a specific journal, publisher normalization, and simplification of document types
- Information added: listed in the Directory of Open Access Journals (DOAJ); indexed in Web of Science (WOS); quartile in Scimago Journal and Country Rank (SJR)
Total size: 32 KB
Version 1: Description
This dataset contains a list of journals that publish data articles, code, software articles and database articles.
The search strategy in DOAJ and Ulrichsweb was to search for the word "data" in the titles of the journals.
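For DOAJ, that title search can also be reproduced programmatically. The sketch below only builds the query URL; the exact endpoint shape is an assumption and should be checked against the current DOAJ API documentation before use:

```python
from urllib.parse import quote

def doaj_title_query_url(term="data"):
    """Build a DOAJ journal-search URL for journals with `term` in the
    title, mirroring the version-1 search strategy. The endpoint path
    is an assumption based on DOAJ's public search API."""
    return "https://doaj.org/api/search/journals/" + quote(f'title:"{term}"')
```

The returned URL can then be fetched with any HTTP client; DOAJ responds with paginated JSON results.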
Acknowledgements:
Xaquín Lores Torres for his invaluable help in preparing this dataset.
https://spdx.org/licenses/CC0-1.0.html
# Open access practices of selected library science journals

The data in this set was culled from the Directory of Open Access Journals (DOAJ), the ProQuest database Library and Information Science Abstracts (LISA), and a sample of peer-reviewed scholarly journals in the field of Library Science. The data include journals that are open access, as first defined by the Budapest Open Access Initiative: "By 'open access' to [scholarly] literature, we mean its free availability on the public internet, permitting any users to read, download, copy, distribute, print, search, or link to the full texts of these articles, crawl them for indexing, pass them as data to software, or use them for any other lawful purpose, without financial, legal, or technical barriers other than those inseparable from gaining access to the internet itself."

Starting with a batch of 377 journals, we focused our dataset on journals that met the following criteria: 1) peer-reviewed, 2) written in English or abstracted in English, 3) actively published at the time of analysis, and 4) scoped to librarianship. The dataset presents an overview of the landscape of open access scholarly publishing in the LIS field during a very specific time period: spring and summer of 2023.

Methods

Data Collection

In the spring of 2023, researchers gathered 377 scholarly journals whose content covered the work of librarians, archivists, and affiliated information professionals. This data encompassed 221 journals from the ProQuest database Library and Information Science Abstracts (LISA), widely regarded as an authoritative database in the field of librarianship. From the Directory of Open Access Journals, we included 144 LIS journals. We also included 12 other journals not indexed in DOAJ or LISA, based on the researchers' knowledge of existing OA library journals. The data is separated into several sets representing the different indices and journals we searched. The first set includes journals from the database LISA.
The following fields are in this dataset:
Journal: title of the journal
Publisher: title of the publishing company
Open Data Policy: lists whether an open data policy exists and what the policy is
Country of publication: country where the journal is published
Open ranking: details whether the journal is diamond, gold, and/or green
Open peer review: specifies if the journal does open peer review
Author retains copyright: explains copyright policy
Charges: details whether there is an article processing charge
In DOAJ: details whether the journal is also listed in the Directory of Open Access Journals
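As a small usage sketch, the "In DOAJ" field can be tallied once the set is loaded as one dict per journal. The "Yes"/"No" cell convention is an assumption about the data, not documented above:

```python
def in_doaj_count(rows):
    """Count journals flagged as also listed in DOAJ, assuming the
    'In DOAJ' column holds Yes/No-style values (an assumption)."""
    return sum(1 for r in rows
               if r.get("In DOAJ", "").strip().lower() == "yes")
```

The same pattern extends to the other yes/no-style fields (open peer review, charges) for quick cross-tabulations of the LISA set.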
The second set includes similar information, but it includes the titles of journals listed in the DOAJ.
Journal: states the title of the journal
Publisher: title of the publishing company
Country: country where the journal is published
Open Data Policy: lists whether an open data policy exists
Open Data Notes: details about the open data policy
OA since: lists when the journal became open access
Open ranking: details whether the journal is diamond, gold, and/or green
Open peer review: specifies if the journal does open peer review
Author Holds Copyright without Restriction: lists whether the author retains copyright without restriction
APC: details whether there is an article processing charge
Type of CC: lists the Creative Commons license applied to the journal articles
In LISA: details whether the journal is also indexed in the Library and Information Science Abstracts database
A third dataset includes twelve scholarly, peer-reviewed journals focused on Library and Information Science but not included in the DOAJ or LISA.
Journal: states the title of the journal
Publisher: title of the publishing company
Country: country where the journal is published
Open Data Policy: lists whether an open data policy exists
Open Data Notes: details about the open data policy
Open ranking: details whether the journal is diamond, gold, and/or green
Open peer review: specifies if the journal does open peer review
Author Holds Copyright without Restriction: lists whether the author retains copyright without restriction
APC: details whether there is an article processing charge
Type of CC: lists the Creative Commons license applied to the journal articles
In LISA?: details whether the journal is also indexed in the Library and Information Science Abstracts database
Data Processing

The researchers downloaded an Excel file from the publisher ProQuest that listed the 221 journals included in LISA. From the DOAJ, the researchers searched and scoped to build an initial list: 144 journals were identified after limiting search results to English-language journals whose scope fell under the DOAJ search term librar* (to cover library, libraries, librarian, librarians, librarianship). Journals also needed to be categorized within the DOAJ subject heading "Bibliography. Library science. Information resources." The journals we analyzed that were in neither index were included based on the researchers' knowledge of current scholarly, peer-reviewed journals that would count toward tenure at their own university, an R1 university.

Once the journals were identified, the researchers divided the journals amongst each other and scoped them against the following criteria: 1) peer-reviewed, 2) written in English or abstracted in English, 3) actively published at the time of analysis, and 4) scoped to librarianship. The end result was 134 journals, whose individual websites the researchers then explored to identify the following items: open data policies, open access publication options, country of origin, publisher, and peer review process. The researchers also looked for article processing charges, type of Creative Commons licensing (open licenses that allow users to redistribute and sometimes remix intellectual property), and whether the journals were included in the DOAJ and/or LISA index.

References: Budapest Open Access Initiative. (2002) http://www.soros.org/openaccess/
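The four scoping criteria can be expressed as a simple filter over journal records. The field names below are illustrative assumptions, not the dataset's actual columns:

```python
def meets_criteria(journal):
    """Apply the four scoping criteria from the methods section to one
    journal record (field names are illustrative assumptions)."""
    return bool(
        journal.get("peer_reviewed")
        and (journal.get("language") == "English"
             or journal.get("english_abstracts"))
        and journal.get("actively_published")
        and journal.get("scope") == "librarianship"
    )

# toy input: one journal passes, one fails the peer-review criterion
journals = [
    {"peer_reviewed": True, "language": "English",
     "actively_published": True, "scope": "librarianship"},
    {"peer_reviewed": False, "language": "English",
     "actively_published": True, "scope": "librarianship"},
]
kept = [j for j in journals if meets_criteria(j)]
```

Running such a filter over the combined 377-journal pool is what narrowed the set to the 134 journals examined in detail.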
This file contains a list of journals used to assess the publication productivity of the top 10 countries across medical specialties. For the 10 medical specialties, the journal categories of the 2020 Scientific Journal Rankings (SJR) were used. These journals are listed in both SJR and PubMed. Three types of journal lists are included: a) the ALL dataset, b) the 30H dataset, and c) the 30P dataset. For the 10 medical specialties, the ALL dataset contains all journals, the 30H dataset contains the 30 journals with the highest h-index scores, and the 30P dataset contains the 30 journals with the highest number of published articles. For these journals, the actual bibliographic records can be downloaded from the NIH website (http://nlm.nih.gov/databases/download/pubmed_medline.html).
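The 30H and 30P subsets are straightforward to derive from the ALL dataset: sort by the relevant metric and keep the top 30. A sketch, with assumed record keys such as `h_index`:

```python
def top_30(journals, key):
    """Build a 30H- or 30P-style subset: the 30 journals ranked
    highest on `key` (e.g. an assumed 'h_index' or 'n_articles')."""
    return sorted(journals, key=lambda j: j[key], reverse=True)[:30]
```

Calling `top_30(all_journals, "h_index")` per specialty would reproduce the 30H lists, and the same call with an article-count key would reproduce 30P.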
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Results of a large-scale survey into how readers discover academic journal articles, books and video. The 2018 research attracted just over 10,000 respondents from all over the world, across all sectors, subjects and job roles. The authors have been researching this area since 2005. The full report is to be published on 3 September.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Background for the study

Systematic reviews are an important type of evidence to inform clinical practice guidelines in health care. They synthesise all available primary research to provide a more reliable estimate of effectiveness and risk factors, among others. Next to the transparent reporting of each phase of the review, the literature search to retrieve all evidence is a crucial component of a systematic review. When the search is of poor quality, the process might not identify all available data for analysis. The subsequent phases, such as screening, data extraction, assessing study quality and synthesising data, depend on identifying the relevant studies. As a result of a poorly executed search, the systematic review might be biased, might lack information and might misinform its users. There are several guiding documents to help researchers produce systematic reviews of good quality, such as methodological handbooks for conducting the research, reporting guidelines for writing a systematic review, and checklists for appraising the quality of systematic reviews.

Reporting guidelines

There are several checklists for evaluating the methodological quality of systematic reviews, such as the Risk of Bias Assessment Tool for Systematic Reviews (ROBIS), A MeaSurement Tool to Assess systematic Reviews 2 (AMSTAR2) and the Critical Appraisal Skills Programme (CASP) checklists. These consist of several items for assessing the methodological aspects of a systematic review, including the literature search. Even though there are plenty of resources on how to conduct, report or assess systematic reviews, papers with poor-quality searches are still being published.
The low quality relates to both the execution (Faggion et al., 2013; Franco et al., 2018; Koffel & Rethlefsen, 2016; Mullins et al., 2014; Opheim et al., 2019; Salvador-Olivan et al., 2019; Sampson & McGowan, 2006; Yoshii et al., 2009) and the reporting of the search (Faggion et al., 2013; Faggion et al., 2018; Koffel & Rethlefsen, 2016; Mullins et al., 2014). Other guiding documents address literature searches in general, such as publishing organisations' ethical guidelines, e.g. the International Committee of Medical Journal Editors (ICMJE) and the Committee on Publication Ethics (COPE) guidelines. These committees encourage journals to instruct authors to follow established guidelines for research, and to state how research data is located, selected and analysed. Furthermore, research methods should be described so that it is possible to reproduce the results (International Committee of Medical Journal Editors, 2022). In addition to the general reporting guidelines, researchers should follow the author instructions provided by the journals. There are several studies investigating the uptake of reporting guidelines (Page & Moher, 2017) and the expectations for reporting statistics in journals (Blann & Nation, 2009; Giofre et al., 2017). Two surveys have investigated the role of author instructions in the reporting of systematic searches. Biocic et al. (2019) investigated, among other things, the systematic review search methods requirements listed in the author instructions of 26 journals: 46% of the journals mentioned reporting guidelines, and 19% gave additional instructions on the reporting of search methods. Unfortunately, the details of these instructions have not been published. Goldberg et al. (2022) assessed author instructions in a sample of publications from one US university. Their sample consisted of 145 unique journals, of which 60% addressed reporting guidelines. Only 9% of the author instructions mentioned searching more than two databases, and 5% recommended working with a librarian when doing a systematic review. As the sample of journals was selected based on publications from only one university, the results may lack transferability to a more general medical and health context.
In this study, we want to further explore the role of academic journals in the field of medicine and health in guiding authors to report systematic review searches. We will do this by reviewing the information in the author instructions of a larger and broader sample than the two studies mentioned above. We also want to analyse the instructions on the information retrieval process in more detail, which differs from the research of Goldberg et al. (2022) and Biocic et al. (2019).
Aim of the study

The aim of this study is to map to what degree the author instructions of medical and health-related journals encourage reporting of systematic review searches. We aim to answer the following research questions:
1. Do journals' author instructions include a section on reporting literature searches for systematic reviews? Are these instructions based on internationally accepted guidelines (e.g. the Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) statement or the Methodological Expectations of Cochrane Intervention Reviews (MECIR) standards) or in-house guidelines? Is it recommended or required to follow the guidelines when submitting a manuscript?
2. Do the author instructions include a procedure used by journal editors to verify that the prescribed reporting guidelines were followed (e.g. submitting a completed PRISMA checklist upon submission of the manuscript)?
3. Do the author instructions recommend involving an information specialist or medical librarian when creating the search strategy, or do they mention consulting one during peer review of systematic reviews?
4. Do the journals' author instructions describe a procedure for registering a systematic review protocol? If so, is it recommended or mandatory to register a protocol?
Academic Search Complete is the world's most valuable and comprehensive scholarly, multi-disciplinary full-text database, with more than 8,500 full-text periodicals, including more than 7,300 peer-reviewed journals.
This dataset was compiled as part of a study on Barriers and Opportunities in the Discoverability and Indexing of Student-led Academic Journals. The list of student journals and their details is compiled from public sources. This list is used to identify the presence of Canadian student journals in Google Scholar as well as in select indexes and databases: DOAJ, Scopus, Web of Science, Medline, Erudit, ProQuest, and HeinOnline. Additionally, the journal publishing platform is recorded for use in a correlational analysis against Google Scholar indexing results. For further details, see the README.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Dataset from a study aiming to analyze the coverage of Latin American and Caribbean journals in Google Scholar Metrics (GSM). Data on 8,205 journals from 24 countries of the region were downloaded from the Latindex database. A Python script was used for automated title search and data extraction (titles, h5-index, h5-median, URLs) in GSM. For journals not found, a manual search was carried out, attempting variations of the title. In total, 3,070 journals were found indexed in GSM, corresponding to 37.42% of the Latindex list. The search was performed on the 2021 edition of GSM, which considers articles published between 2016 and 2020 and citations registered until July 2021. The number of all types of documents published (productivity) in the h5-index period (2016-2020) in Scopus, Journal Citation Reports, and SciELO was also identified for 1,314 journals.
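The manual fallback of trying title variations can be approximated with normalized fuzzy matching. This is a sketch of the idea using the standard library, not the study's actual script:

```python
import difflib
import unicodedata

def normalize(title):
    """Lowercase a title, strip accents, and reduce punctuation to
    spaces, so near-identical titles compare equal."""
    t = unicodedata.normalize("NFKD", title.lower())
    t = "".join(c for c in t if not unicodedata.combining(c))
    return " ".join("".join(c if c.isalnum() else " " for c in t).split())

def best_match(latindex_title, gsm_titles, cutoff=0.9):
    """Return the GSM title closest to a Latindex title, or None if
    nothing is similar enough."""
    normalized = [normalize(t) for t in gsm_titles]
    hits = difflib.get_close_matches(normalize(latindex_title),
                                     normalized, n=1, cutoff=cutoff)
    if not hits:
        return None
    # map the normalized hit back to the original GSM title
    return gsm_titles[normalized.index(hits[0])]
```

A high cutoff keeps false positives down, which matters when two journals in the region share most of a title.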
The present dataset is the result of this study, which is under peer review in a scientific journal.
The dataset comprises the titles, h5-index, h5-median, and URLs of 3,070 publications from Latin America and the Caribbean identified in Google Scholar Metrics, together with the respective editorial information extracted from Latindex.
The original language of the content was kept, mainly Spanish in the case of editorial data from Latindex. The column descriptors are also given in English.
The productivity data refer to the number of all types of documents published by the journals in the period 2016-2020. Data were extracted from the InCities Journal Citation Reports, Scopus, and SciELO Citation Index (Web of Science database).
In this version 2, only the productivity data were changed, covering a larger number of journals (1,314) and including all types of documents. Other data are the same as in the first version (https://doi.org/10.5281/zenodo.5572873).
https://www.gesis.org/en/institute/data-usage-terms
Data sharing is key for replication and re-use in empirical research. Scientific journals can play a central role by establishing data policies and providing technologies. In this study, factors influencing data sharing are analyzed by investigating journal data policies and author behavior in sociology. The websites of 140 sociology journals were consulted to check their data policies. The results are compared with similar studies from political science and economics. For five selected journals covering a broad variety, all articles from two years are examined to see whether authors actually cite and share their data, and which factors are related to this.
Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
License information was derived automatically
Search terms used and journals searched.
This data corresponds to the article "Write Up! A Study of Copyright Information on Library-Published Journals," and covers the copyright and licensing policies of library-published scholarly journals.
CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
License information was derived automatically
This data package contains the results of a large-scale analysis of author guidelines from several publishers and journals active in chemistry research, showing how well the publishing landscape supports different criteria.
The journals' author guidelines and/or editorial policies were examined to determine whether they take a stance on the availability of the underlying data of a submitted article. The mere possibility of providing supplementary material along with the submitted article was not considered a research data policy in the present study. Furthermore, source code and algorithms were excluded from the scope of the paper, so policies related to them are not included in the analysis.
For the selection of journals within the field of neurosciences, Clarivate Analytics' InCites Journal Citation Reports database was searched using the categories neurosciences and neuroimaging. From the results, the journals with the 40 highest Impact Factor indicators (for the year 2017) were extracted for scrutiny of research data policies. Respectively, the selection of journals within the field of physics was created by performing a similar search with the categories physics, applied; physics, atomic, molecular & chemical; physics, condensed matter; physics, fluids & plasmas; physics, mathematical; physics, multidisciplinary; physics, nuclear; and physics, particles & fields. From the results, the journals with the 40 highest Impact Factor indicators were again extracted for scrutiny. Similarly, the 40 journals representing the field of operations research were extracted using the search category operations research and management.
Journal-specific data policies were sought from journal websites providing author guidelines or editorial policies. The examination of journal data policies was carried out in May 2019. The primary data source was journal-specific author guidelines. If journal guidelines explicitly linked to the publisher’s general policy on research data, that policy was used in the analyses. If a journal-specific research data policy, or the lack of one, was inconsistent with the publisher’s general policies, the journal-specific policies and guidelines were prioritized. Journals whose author guidelines were not openly available online (e.g., because they accept submissions on an invite-only basis) were not included in the data. Journals that exclusively publish review articles were also excluded and replaced with the journal having the next highest Impact Factor, so that each of the three field-specific sets consisted of 40 journals. The final data thus comprised 120 journals in total.
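As an illustration, the top-40 selection with replacement of review-only journals described above could be sketched as follows. The journal records, field values, and function name here are hypothetical, not taken from the study's materials:

```python
# Sketch: rank journals by Impact Factor, skip journals that exclusively
# publish review articles, and fill each skipped slot with the journal
# holding the next highest Impact Factor.

def select_top_journals(journals, n=40):
    """Return the n highest-Impact-Factor journals, excluding
    review-only journals; each exclusion is backfilled by the
    next journal in Impact Factor order."""
    ranked = sorted(journals, key=lambda j: j[1], reverse=True)
    return [j for j in ranked if not j[2]][:n]

# Hypothetical records: (title, impact_factor, publishes_only_reviews)
journals = [
    ("Journal A", 12.1, False),
    ("Reviews in X", 30.5, True),   # review-only: skipped and replaced
    ("Journal B", 9.7, False),
    ("Journal C", 8.2, False),
]
print([title for title, _, _ in select_top_journals(journals, n=3)])
# → ['Journal A', 'Journal B', 'Journal C']
```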
‘Public deposition’ refers to a scenario where the researcher deposits data in a public repository, transferring the administrative role over the data to the receiving repository. ‘Scientific sharing’ refers to a scenario where the researcher administers the data locally and provides it on request to interested readers. Note that none of the journals examined required that all data types underlying a submitted work be deposited in a public data repository; however, some journals required public deposition of specific data types. Within the journal research data policies examined here, these data types are well represented by the Springer Nature policy on “Availability of data, materials, code and protocols” (Springer Nature, 2018): DNA and RNA data; protein sequences and DNA and RNA sequencing data; genetic polymorphisms data; linked phenotype and genotype data; gene expression microarray data; proteomics data; and macromolecular structures and crystallographic data for small molecules. The registration of clinical trials in a public repository was also considered a data type in this study. The term ‘specific data types’ used in the custom coding framework of the study thus refers to both life sciences data and the public registration of clinical trials. These data types have community-endorsed public repositories, where deposition was most often mandated within the journals’ research data policies.
The term ‘location’ refers to whether the journal’s data policy provides suggestions or requirements for the repositories or services used to share the data underlying submitted works. A general reference to ‘public repositories’ was not considered a location suggestion; only references to individual repositories and services were. The category of ‘immediate release of data’ examines whether the journal’s research data policy addresses the timing of publication of the underlying data. Note that even where a journal only encourages public deposition of data, its editorial processes may be set up so that either the research data or the research data metadata is published in conjunction with the submitted work.
Attribution 4.0 (CC BY 4.0) (https://creativecommons.org/licenses/by/4.0/)
License information was derived automatically
The Delphi method is an iterative, anonymous, group-based process for eliciting and aggregating opinion on a topic to explore the existence of consensus among experts. The year 2023 marks the 60th anniversary of the first peer-reviewed journal article on the Delphi method. Originally developed for operations research, this method is now applied extensively by researchers representing diverse scientific fields. We used a bibliometric analysis to describe general trends in the expansion of its use across disciplines over time. We conducted a systematic literature search for all English-language, peer-reviewed journal articles on the Delphi method through its first 60 years. We found 19,831 articles: 96.8% (n = 19,204) on the actual use of the Delphi method in an empirical study and 3.2% (n = 627) describing, examining, or providing some guidance on how to use the Delphi method. Almost half (49.9%) of all articles were published in the 2010s and an additional third (32.5%) in the first few years of the 2020s. Nearly two-thirds (65%, n = 12,883) of all published articles have appeared in medical journals, compared to 15% in science and technology (n = 3,053) or social science (n = 3,016) journals. We conclude that the expanded use of the Delphi method has been driven largely by the medical field, though social scientists and technologists continue to be at the forefront of methodological work on the Delphi method. Therefore, we call for greater transdisciplinary collaboration on methodological guidance and standards for the Delphi method.
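The shares reported in the abstract above can be recomputed directly from the raw article counts, as a quick arithmetic check:

```python
# Recomputing the percentages from the raw counts given in the abstract.
total = 19831
empirical = 19204       # articles applying the Delphi method in a study
methodological = 627    # articles describing or examining the method
medical = 12883         # articles published in medical journals

assert empirical + methodological == total  # the two groups partition the corpus

print(round(100 * empirical / total, 1))       # 96.8
print(round(100 * methodological / total, 1))  # 3.2
print(round(100 * medical / total))            # 65
```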
https://data.gov.tw/license
The list of papers published by the research institute each year at domestic and international conferences and in publications includes the titles of the papers, the names of the conferences or publications, the publication dates, and the publishing units, making it convenient to search for papers of interest. For further information, you can browse the publications directly or contact the relevant personnel of the institute.
Journal of Chemistry Abstract & Indexing - ResearchHelpDesk - Journal of Chemistry is a peer-reviewed, Open Access journal that publishes original research articles as well as review articles on all aspects of fundamental and applied chemistry. Journal of Chemistry is archived in Portico, which provides permanent archiving for electronic scholarly journals, as well as via the LOCKSS initiative. It operates a fully open access publishing model that allows open global access to its published content; this model is supported through Article Processing Charges. Journal of Chemistry is included in many leading abstracting and indexing databases. The journal’s most recent Impact Factor is 1.727 according to the 2018 Journal Citation Reports released by Clarivate Analytics in 2019, and its most recent CiteScore is 1.32 according to the CiteScore 2018 metrics released by Scopus. Abstracting and indexing: Academic Search Alumni Edition; Academic Search Complete; AgBiotech Net; AgBiotech News and Information; AGRICOLA; Agricultural Economics Database; Agricultural Engineering Abstracts; Agroforestry Abstracts; Animal Breeding Abstracts; Animal Science Database; Biofuels Abstracts; Botanical Pesticides; CAB Abstracts; Chemical Abstracts Service (CAS); CNKI Scholar; Crop Physiology Abstracts; Crop Science Database; Directory of Open Access Journals (DOAJ); EBSCOhost Connection; EBSCOhost Research Databases; Elsevier BIOBASE - Current Awareness in Biological Sciences (CABS); EMBIOlogy; Energy and Power Source; Global Health; Google Scholar; J-Gate Portal; Journal Citation Reports - Science Edition; Open Access Journals Integrated Service System Project (GoOA); Primo Central Index; Reaxys; Science Citation Index Expanded; Scopus; Textile Technology Index; The Summon Service; WorldCat Discovery Services.
# Open access practices of selected library science journals
The data in this set was culled from the Directory of Open Access Journals (DOAJ), the Proquest database Library and Information Science Abstracts (LISA), and a sample of peer reviewed scholarly journals in the field of Library Science.
The data include journals that are open access, which was first defined by the Budapest Open Access Initiative:
By ‘open access’ to [scholarly] literature, we mean its free availability on the public internet, permitting any users to read, download, copy, distribute, print, search, or link to the full texts of these articles, crawl them for indexing, pass them as data to software, or use them for any other lawful purpose, without financial, legal, or technical barriers other than those inseparable from gaining access to the internet itself.
Starting with a batch of 377 journals, we focused our dataset to include journals that met the following criteria: 1) peer-reviewed 2) written in Engli...