100+ datasets found
  1. Data from: Journal Ranking Dataset

    • kaggle.com
    Updated Aug 15, 2023
    Cite
    Abir (2023). Journal Ranking Dataset [Dataset]. https://www.kaggle.com/datasets/xabirhasan/journal-ranking-dataset
    Explore at:
    Croissant (a format for machine-learning datasets; see mlcommons.org/croissant)
    Dataset updated
    Aug 15, 2023
    Dataset provided by
    Kaggle
    Authors
    Abir
    License

    https://creativecommons.org/publicdomain/zero/1.0/

    Description

    Journals & Ranking

    An academic journal or research journal is a periodical publication in which research articles relating to a particular academic discipline are published, according to Wikipedia. Currently, more than 25,000 peer-reviewed journals are indexed in citation index databases such as Scopus and Web of Science. The indexed journals are ranked on the basis of various metrics such as CiteScore, H-index, etc. These metrics are calculated from the journal's yearly citation data, and considerable effort goes into designing a metric that reflects a journal's quality.

    Journal Ranking Dataset

    This is a comprehensive dataset on academic journals covering their metadata as well as citation, metric, and ranking information. Detailed data on their subject areas is also included. The dataset is collected from the following indexing databases: Scimago Journal Ranking, Scopus, and the Web of Science Master Journal List.

    The data was collected by web scraping and then cleaned; details can be found HERE.

    Key Features

    • Rank: Overall rank of journal (derived from sorted SJR index).
    • Title: Name or title of journal.
    • OA: Open Access or not.
    • Country: Country of origin.
    • SJR-index: A citation index calculated by Scimago.
    • CiteScore: A citation index calculated by Scopus.
    • H-index: Hirsch index, the largest number h such that at least h articles in that journal were cited at least h times each (a short computation sketch follows the subject-area list below).
    • Best Quartile: Top Q-index or quartile a journal has in any subject area.
    • Best Categories: Subject areas with top quartile.
    • Best Subject Area: Highest ranking subject area.
    • Best Subject Rank: Rank of the highest ranking subject area.
    • Total Docs.: Total number of documents of the journal.
    • Total Docs. 3y: Total number of documents in the past 3 years.
    • Total Refs.: Total number of references of the journal.
    • Total Cites 3y: Total number of citations in the past 3 years.
    • Citable Docs. 3y: Total number of citable documents in the past 3 years.
    • Cites/Doc. 2y: Total number of citations divided by the total number of documents in the past 2 years.
    • Refs./Doc.: Total number of references divided by the total number of documents.
    • Publisher: Name of the publisher company of the journal.
    • Core Collection: Web of Science core collection name.
    • Coverage: Starting year of coverage.
    • Active: Active or inactive.
    • In-Press: Articles in press or not.
    • ISO Language Code: Three-letter ISO 639 code for language.
    • ASJC Codes: All Science Journal Classification codes for the journal.

    The rest of the features provide further detail on the journal's subject areas or categories. Four top-level subject area fields: Life Sciences, Social Sciences, Physical Sciences, and Health Sciences. ASJC main category fields: 1000 General; 1100 Agricultural and Biological Sciences; 1200 Arts and Humanities; 1300 Biochemistry, Genetics and Molecular Biology; 1400 Business, Management and Accounting; 1500 Chemical Engineering; 1600 Chemistry; 1700 Computer Science; 1800 Decision Sciences; 1900 Earth and Planetary Sciences; 2000 Economics, Econometrics and Finance; 2100 Energy; 2200 Engineering; 2300 Environmental Science; 2400 Immunology and Microbiology; 2500 Materials Science; 2600 Mathematics; 2700 Medicine; 2800 Neuroscience; 2900 Nursing; 3000 Pharmacology, Toxicology and Pharmaceutics; 3100 Physics and Astronomy; 3200 Psychology; 3300 Social Sciences; 3400 Veterinary; 3500 Dentistry; 3600 Health Professions.
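    Because the Rank field is described as derived from the sorted SJR index and the H-index has a simple definition, a small exploration sketch may help when working with the file. This is a minimal, hedged example: the file name is hypothetical and the column names are taken from the feature list above, so the exact headers in the Kaggle download may differ.

```python
# Minimal sketch (not from the dataset's docs): load the CSV and reproduce
# two of the documented fields. File and column names are assumptions.
import pandas as pd

df = pd.read_csv("journal-ranking-dataset.csv")  # hypothetical file name

# "Rank" is described as derived from the sorted SJR index: rank by SJR descending.
df["derived_rank"] = df["SJR-index"].rank(method="first", ascending=False).astype(int)

# H-index for a single journal, given per-article citation counts
# (illustrative only; the dataset ships the H-index precomputed).
def h_index(citations):
    cited = sorted(citations, reverse=True)
    return sum(1 for i, c in enumerate(cited, start=1) if c >= i)

print(df[["Title", "SJR-index", "derived_rank", "H-index"]].head())
print(h_index([10, 8, 5, 4, 3]))  # -> 4
```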

  2. Data sharing policies in scholarly journals across 22 disciplines [dataset]

    • figshare.com
    xlsx
    Updated May 31, 2023
    Cite
    Ui Ikeuchi (2023). Data sharing policies in scholarly journals across 22 disciplines [dataset] [Dataset]. http://doi.org/10.6084/m9.figshare.3144991.v2
    Explore at:
    xlsx
    Dataset updated
    May 31, 2023
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    Ui Ikeuchi
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Survey period: 08 April - 08 May 2014. Top 10 Impact Factor journals in each of 22 categories.

    Figures https://doi.org/10.6084/m9.figshare.6857273.v1

    Article https://doi.org/10.20651/jslis.62.1_20 https://doi.org/10.15068/00158168

  3. Using social media to promote academic research: Identifying the benefits of twitter for sharing academic work

    • plos.figshare.com
    docx
    Updated May 31, 2023
    Cite
    Samara Klar; Yanna Krupnikov; John Barry Ryan; Kathleen Searles; Yotam Shmargad (2023). Using social media to promote academic research: Identifying the benefits of twitter for sharing academic work [Dataset]. http://doi.org/10.1371/journal.pone.0229446
    Explore at:
    docx
    Dataset updated
    May 31, 2023
    Dataset provided by
    PLOS ONE
    Authors
    Samara Klar; Yanna Krupnikov; John Barry Ryan; Kathleen Searles; Yotam Shmargad
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    To disseminate research, scholars once relied on university media services or journal press releases, but today any academic can turn to Twitter to share their published work with a broader audience. The possibility that scholars can push their research out, rather than hope that it is pulled in, holds the potential for scholars to draw wide attention to their research. In this manuscript, we examine whether there are systematic differences in the types of scholars who most benefit from this push model. Specifically, we investigate the extent to which there are gender differences in the dissemination of research via Twitter. We carry out our analyses by tracking tweet patterns for articles published in six journals across two fields (political science and communication), and we pair this Twitter data with demographic and educational data about the authors of the published articles, as well as article citation rates. We find considerable evidence that, overall, article citations are positively correlated with tweets about the article, and we find little evidence to suggest that author gender affects the transmission of research in this new media.

  4. Higher Education Research: A Compilation of Journals and Abstracts 2019

    • zenodo.org
    • data.niaid.nih.gov
    bin
    Updated Nov 5, 2020
    Cite
    Alexandra Hertwig (2020). Higher Education Research: A Compilation of Journals and Abstracts 2019 [Dataset]. http://doi.org/10.5281/zenodo.4244370
    Explore at:
    bin
    Dataset updated
    Nov 5, 2020
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Alexandra Hertwig
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Dataset to original publication:

    Hertwig, Alexandra (2020): Higher Education Research. A Compilation of Journals and Abstracts 2019. Kassel: INCHER-Kassel. DOI: 10.17170/kobra-202010292027.

    The Research Information Service (RIS) of INCHER-Kassel, Germany has provided an annual compilation of academic journals since 2013. This useful information tool for researchers also provides, as a “side effect”, an overview of current topics in higher education research. The datasets allow for further evaluation of single or multiple volumes. For more information on the original publications and available datasets, please visit INCHER’s RIS websites.

    http://www.uni-kassel.de/einrichtungen/en/incher/risspecial-research-library/ris-documents.html

  5. Journal Of Management Research And Analysis Impact Factor 2024-2025 - ResearchHelpDesk

    • researchhelpdesk.org
    Updated Feb 23, 2022
    Cite
    Research Help Desk (2022). Journal Of Management Research And Analysis Impact Factor 2024-2025 - ResearchHelpDesk [Dataset]. https://www.researchhelpdesk.org/journal/impact-factor-if/51/journal-of-management-research-and-analysis
    Explore at:
    Dataset updated
    Feb 23, 2022
    Dataset authored and provided by
    Research Help Desk
    Description

    Journal Of Management Research And Analysis Impact Factor 2024-2025 - ResearchHelpDesk - The Journal of Management Research and Analysis (JMRA) is a double-blind peer-reviewed journal that provides a specialized academic medium and important reference for the encouragement and dissemination of research and practice in management research. JMRA carries theoretical and empirical papers, case studies, research notes, executive experience sharing, and review articles, and it aims at disseminating new knowledge in different domain areas of management, information technology, and related disciplines. It provides a forum for deliberation and exchange of knowledge among academics, industries, researchers, planners and practitioners concerned with management, financial institutions, public and private organizations, as well as voluntary organizations. Our editorial policy is that the journal serves the profession by publishing significant new scholarly research of the highest quality in the management discipline areas.

    Aim & Scope: The Journal of Management Research and Analysis (JMRA) is a quarterly, international, refereed journal published with the aim of providing an online publishing platform for academia, management researchers, and management students to publish their original works. It aims at bringing intellectuals together through the dissemination of original research, new ideas, innovations and practical experience in the concerned fields on a common platform. It also aims at understanding, advancing and promoting the emerging global trends in learning and knowledge assimilation in management research, imparting the same for the benefit of industry and academia, further improving education systems at the national as well as global level, and involving the student fraternity in the ongoing discussion of socially desirable economics, commerce and management issues. JMRA focuses on publishing scholarly articles from the areas of management, management principles, recent inventions in management, company management, financial management, human resources, accounting, marketing, management control systems, supply chain management, operations management, human resource management, economics, commerce, statistics, international business, information technology, environment, risk management, import-export management, logistics management, hospitality management, health and hospital management, globalization and related areas.

    The Journal of Management Research and Analysis seeks original manuscripts that identify, extend, unify, test or apply scientific and multi-disciplinary knowledge concerning the management field. The following types of papers are considered for publication:
    1. Original research works in the above-mentioned fields.
    2. Surveys, opinions, abstracts, and essays related to operations research.
    3. A few review papers, if the author has done considerable work in that area.
    4. Case studies related to the management domain.

    Indexing Information: Index Copernicus, Google Scholar, UGC, Crossref, etc.

  6. Undergraduate Research Journal Data, 2014-2015

    • databank.illinois.edu
    Updated Jul 23, 2018
    Cite
    Merinda Kaye Hensley; Heidi R. Johnson (2018). Undergraduate Research Journal Data, 2014-2015 [Dataset]. http://doi.org/10.13012/B2IDB-5348256_V1
    Explore at:
    Dataset updated
    Jul 23, 2018
    Authors
    Merinda Kaye Hensley; Heidi R. Johnson
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Dataset funded by
    University of Illinois at Urbana-Champaign, University Library, Research and Publication Committee
    Description

    Qualitative data collected from the websites of undergraduate research journals between October 2014 and May 2015. Two CSV files are included. The first file, "Sample", includes the sample of journals for which secondary data was collected. The second file, "Population", includes the remainder of the population for which secondary data was not collected. Note: the totals do not add up to 800 as indicated in the article; rows were deleted for journals that had broken links or defunct websites during the random sampling process.
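    A minimal loading sketch for the two files described above, assuming they are distributed as plain CSVs named after their descriptions; the actual file names and column layout are not specified in this record.

```python
# Minimal sketch: load the "Sample" and "Population" files and report row counts.
# File names are assumptions based on the description; adjust to the actual names.
import pandas as pd

sample = pd.read_csv("Sample.csv")          # journals with secondary data collected
population = pd.read_csv("Population.csv")  # remaining journals, no secondary data

print(f"Sample journals: {len(sample)}")
print(f"Population journals (no secondary data): {len(population)}")
print(f"Combined total: {len(sample) + len(population)}")  # per the note, less than 800
```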

  7. Map of articles about "Teaching Open Science"

    • zenodo.org
    • data.niaid.nih.gov
    Updated Jan 24, 2020
    Cite
    Isabel Steinhardt (2020). Map of articles about "Teaching Open Science" [Dataset]. http://doi.org/10.5281/zenodo.3371415
    Explore at:
    Dataset updated
    Jan 24, 2020
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Isabel Steinhardt
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This description is part of the blog post "Systematic Literature Review of teaching Open Science" https://sozmethode.hypotheses.org/839

    In my opinion, we do not pay enough attention to teaching Open Science in higher education. Therefore, I designed a seminar to teach students the practices of Open Science by doing qualitative research. About this seminar, I wrote the article ”Teaching Open Science and qualitative methods“. For that article, I started to review the literature on ”Teaching Open Science“. The result of my literature review is that certain aspects of Open Science are used for teaching. However, Open Science with all its aspects (Open Access, Open Data, Open Methodology, Open Science Evaluation and Open Science Tools) is not an issue in publications about teaching.

    Based on this insight, I have started a systematic literature review. I quickly realized that I need help to analyse and interpret the articles and to evaluate my preliminary findings. The different disciplinary cultures of teaching different aspects of Open Science are especially challenging, as I myself, as a social scientist, do not have enough insight to be able to interpret the results correctly. Therefore, I would like to invite you to participate in this research project!

    I am now looking for people who would like to join a collaborative process to further explore and write the systematic literature review on “Teaching Open Science“, because I want to turn this project into a Massive Open Online Paper (MOOP). According to the 10 rules of Tennant et al. (2019) on MOOPs, it is crucial to find a core group that is enthusiastic about the topic. Therefore, I am looking for people who are interested in creating the structure of the paper and writing the paper together with me. I am also looking for people who want to search for and review literature or evaluate the literature I have already found. Together with the interested persons, I would then define the rules for the project (cf. Tennant et al. 2019). So if you are interested in contributing to the further search for articles and/or in enhancing the interpretation and writing of results, please get in touch. For everyone interested in contributing, the list of articles collected so far is freely accessible on Zotero: https://www.zotero.org/groups/2359061/teaching_open_science. The figure shown below provides a first overview of my ongoing work. I created the figure with the free software yEd and uploaded the file to Zenodo, so everyone can download and work with it:

    To make transparent what I have done so far, I will first introduce what a systematic literature review is. Secondly, I describe the decisions I made to start with the systematic literature review. Third, I present the preliminary results.

    Systematic literature review – an Introduction

    Systematic literature reviews “are a method of mapping out areas of uncertainty, and identifying where little or no relevant research has been done.” (Petticrew/Roberts 2008: 2). Fink defines the systematic literature review as a “systemic, explicit, and reproducible method for identifying, evaluating, and synthesizing the existing body of completed and recorded work produced by researchers, scholars, and practitioners.” (Fink 2019: 6). The aim of a systematic literature review is to overcome the subjectivity of a researcher’s search for literature. However, there can never be a fully objective selection of articles, because the researcher has already made a preselection, for example by deciding on search strings such as “Teaching Open Science”. In this respect, transparency is the core criterion for a high-quality review.

    In order to achieve high quality and transparency, Fink (2019: 6-7) proposes the following seven steps:

    1. Selecting a research question.
    2. Selecting the bibliographic database.
    3. Choosing the search terms.
    4. Applying practical screening criteria.
    5. Applying methodological screening criteria.
    6. Doing the review.
    7. Synthesizing the results.

    I have adapted these steps for the “Teaching Open Science” systematic literature review. In the following, I will present the decisions I have made.

    Systematic literature review – decisions I made

    1. Research question: I am interested in the following research questions: How is Open Science taught in higher education? Is Open Science taught in its full range with all aspects like Open Access, Open Data, Open Methodology, Open Science Evaluation and Open Science Tools? Which aspects are taught? Are there disciplinary differences as to which aspects are taught and, if so, why are there such differences?
    2. Databases: I started my search at the Directory of Open Access Journals (DOAJ). “DOAJ is a community-curated online directory that indexes and provides access to high quality, open access, peer-reviewed journals.” (https://doaj.org/) Secondly, I used the Bielefeld Academic Search Engine (BASE). BASE is operated by Bielefeld University Library and is “one of the world’s most voluminous search engines especially for academic web resources” (base-search.net). Both platforms are non-commercial and focus on Open Access publications, and thus differ from commercial publication databases such as Web of Science and Scopus. For this project, I deliberately decided against commercial providers and against restricting the search to indexed journals, because my explicit aim was to find articles that are open in the context of Open Science.
    3. Search terms: To identify articles about teaching Open Science, I used the following search strings: “teaching open science” OR teaching “open science” OR teach „open science“. The topic search looked for the search strings in the title, abstract and keywords of articles. Since these are very narrow search terms, I decided to broaden the method: I searched the reference lists of all articles returned by this search for further relevant literature. Using Google Scholar, I checked which other authors cited the articles in the sample. If the articles checked in this way met my methodological criteria, I included them in the sample and looked through their reference lists and citations on Google Scholar. This process has not yet been completed.
    4. Practical screening criteria: I included English and German articles in the sample, as I speak these languages (articles in other languages are very welcome, if there are people who can interpret them!). Only journal articles, articles in edited volumes, working papers and conference papers from proceedings were included in the sample. I checked whether the journals were predatory journals; such articles were not included. I did not include blog posts, books or newspaper articles. I only included articles whose full texts are accessible via my institution (University of Kassel). As a result, recently published articles at Elsevier could not be included because of the special situation in Germany regarding Project DEAL (https://www.projekt-deal.de/about-deal/). For articles that are not freely accessible, I checked whether there is an accessible version in a repository or whether a preprint is available. If this was not the case, the article was not included. I started the analysis in May 2019.
    5. Methodological criteria: The method described above to check the reference lists has the problem of subjectivity. Therefore, I hope that other people will be interested in this project and evaluate my decisions. I have used the following criteria as the basis for my decisions: First, the articles must focus on teaching. For example, this means that articles must describe how a course was designed and carried out. Second, at least one aspect of Open Science has to be addressed. The aspects can be very diverse (FOSS, repositories, wiki, data management, etc.) but have to comply with the principles of openness. This means, for example, I included an article when it deals with the use of FOSS in class and addresses the aspects of openness of FOSS. I did not include articles when the authors describe the use of a particular free and open source software for teaching but did not address the principles of openness or re-use.
    6. Doing the review: Due to the methodical approach of going through the reference lists, it is possible to create a map of how the articles relate to each other. This results in thematic clusters and connections between clusters. The starting point for the map were four articles (Cook et al. 2018; Marsden, Thompson, and Plonsky 2017; Petras et al. 2015; Toelch and Ostwald 2018) that I found using the databases and criteria described above. I used yEd to generate the network. „yEd is a powerful desktop application that can be used to quickly and effectively generate high-quality diagrams.” (https://www.yworks.com/products/yed) In the network, arrows show, which articles are cited in an article and which articles are cited by others as well. In addition, I made an initial rough classification of the content using colours. This classification is based on the contents mentioned in the articles’ title and abstract. This rough content classification requires a more exact, i.e., content-based subdivision and

  8. Higher Education Research: A Compilation of Journals and Abstracts 2016

    • zenodo.org
    bin
    Updated Nov 5, 2020
    Cite
    Alexandra Hertwig (2020). Higher Education Research: A Compilation of Journals and Abstracts 2016 [Dataset]. http://doi.org/10.5281/zenodo.4244488
    Explore at:
    bin
    Dataset updated
    Nov 5, 2020
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Alexandra Hertwig
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Revised dataset to original publication:

    Hertwig, Alexandra (2017): Higher Education Research: A Compilation of Journals and Abstracts 2016. Kassel: INCHER-Kassel. http://www.uni-kassel.de/einrichtungen/fileadmin/datas/einrichtungen/incher/Higher_Education_Research_-_A_Compilation_of_Journals_and_Abstracts_2016.pdf (accessed November 04, 2020)

    The datasets for the 2013 to 2018 volumes may vary from the original publications with regard to the journals covered. The datasets include journals and the respective publication data, explicitly providing persistent identifiers.

    The Research Information Service (RIS) of INCHER-Kassel, Germany has provided an annual compilation of academic journals since 2013. The datasets allow for further evaluation of single or multiple volumes. For more information on the original publications and available datasets, please visit INCHER’s RIS websites.

    http://www.uni-kassel.de/einrichtungen/en/incher/risspecial-research-library/ris-documents.html

  9. The Journal of Engineering Research Impact Factor 2024-2025 - ResearchHelpDesk

    • researchhelpdesk.org
    Updated Feb 23, 2022
    Cite
    Research Help Desk (2022). The Journal of Engineering Research Impact Factor 2024-2025 - ResearchHelpDesk [Dataset]. https://www.researchhelpdesk.org/journal/impact-factor-if/153/the-journal-of-engineering-research
    Explore at:
    Dataset updated
    Feb 23, 2022
    Dataset authored and provided by
    Research Help Desk
    Description

    The Journal of Engineering Research Impact Factor 2024-2025 - ResearchHelpDesk - The Journal of Engineering Research (TJER) is an open access, refereed international publication of Sultan Qaboos University, Sultanate of Oman. The Journal provides a medium through which engineering researchers and scholars from around the world are able to publish their scholarly applied and/or fundamental research work. This platform provides further information about the Journal's activities and publications, and you can share your technical and scientific knowledge and experience with others through publications and news releases. The journal is not responsible for opinions printed in its publication; they represent the views of the individuals to whom they are credited and are not binding upon the journal.

    Contributions of high technical merit are expected to span the breadth of the engineering disciplines. They may cover the main areas of engineering: Electrical, Electronics, Control and Computer Engineering; Information Engineering and Technology; Mechanical, Industrial and Manufacturing Engineering; Aerospace Engineering; Automation and Mechatronics Engineering; Materials, Chemical and Process Engineering; Civil and Architecture Engineering; Biotechnology and Bio-Engineering; Environmental Engineering; Biological Engineering; Genetic Engineering; Petroleum and Natural Gas Engineering; Mining Engineering; and Marine and Agriculture Engineering.

    TJER - Abstracting and Indexing: SCOPUS, Google Scholar, DOAJ, CrossRef, EBSCO, Jgate, LOCKSS, Al Manhal, ISC Master Journals List.

  10. The Journal of Community Health Management CiteScore 2024-2025 - ResearchHelpDesk

    • researchhelpdesk.org
    Updated Aug 3, 2022
    Cite
    Research Help Desk (2022). The Journal of Community Health Management CiteScore 2024-2025 - ResearchHelpDesk [Dataset]. https://www.researchhelpdesk.org/journal/sjr/592/the-journal-of-community-health-management
    Explore at:
    Dataset updated
    Aug 3, 2022
    Dataset authored and provided by
    Research Help Desk
    Description

    The Journal of Community Health Management CiteScore 2024-2025 - ResearchHelpDesk - The Journal of Community Health Management (JCHM) is an open access, peer-reviewed quarterly journal published since 2014 under the auspices of the Innovative Education and Scientific Research Foundation (IESRF), aiming to support researchers, scholars, academicians, and professionals in all academic and scientific disciplines. IESRF is dedicated to the transfer of technology and research by publishing scientific journals and research content, providing professional memberships, and conducting conferences, seminars, and award programs. With the aim of faster and better dissemination of knowledge, articles are published ‘Ahead of Print’ immediately on acceptance. In addition, the journal allows free access (Open Access) to its contents, which is likely to attract more readers and citations to articles published in JCHM. Manuscripts must be prepared in accordance with the “Uniform requirements for Manuscripts submitted to Biomedical Journals” guidelines of the International Committee of Medical Journal Editors (updated December 2019). The uniform requirements and specific requirements of JCHM are mentioned below. Before sending a manuscript, contributors are requested to check the author guidelines, available from the journal's website (www.jchm.in/info/author) or directly from the manuscript submission website (Innovative Pre-Publication Portal): https://innovpub.org/journal/JCHM

    Aims and Scope: The aim and commitment of the journal is to publish research-oriented manuscripts addressing significant issues in all subjects and areas of community health management. The journal is committed to improving education and research on acute care, biostatistics, community health, epidemiology and health services research, health management, medicine, and allied branches of the medical sciences, including health statistics, nutrition, preventive medicine, primary prevention, primary health care, secondary prevention, secondary healthcare, tertiary healthcare, etc.

    Indexing and Abstracting Information: Index Copernicus, Google Scholar, Indian Science Abstracts, National Science Library, J-gate, ROAD, CrossRef, Microsoft Academic, Indian Citation Index (ICI).

    Journal Ethics: The journal is committed to upholding the highest standards of ethical behavior at all stages of publication. We strictly adhere to the standards and best-practice guidelines of industry associations such as the Committee on Publication Ethics (COPE), the International Committee of Medical Journal Editors (ICMJE), and the World Association of Medical Editors (WAME). For our specific policies regarding duplicate publication, conflict of interest, patient consent, etc., please visit the editor guidelines.

  11. National Research Foundation of Korea_Academic Journal Number Information

    • data.go.kr
    csv
    Updated Aug 12, 2025
    Cite
    (2025). National Research Foundation of Korea_Academic Journal Number Information [Dataset]. https://www.data.go.kr/en/data/15118475/fileData.do
    Explore at:
    csv
    Dataset updated
    Aug 12, 2025
    License

    https://data.go.kr/ugs/selectPortalPolicyView.do

    Area covered
    South Korea
    Description

    This data provides the number of journals registered with the Online Journals and Review System (JAMS) and related information, including institutional classification, institution name, journal title, journal abbreviation, number of articles, language used, indexing category, and publication period. This data is useful for understanding the current state of domestic and international journals, analyzing research trends, and verifying journal indexing and publication information. Academic researchers, universities, and academic societies can use this data to select appropriate journals for submission and develop effective submission strategies. It is also utilized in the development of academic information services, contributing to increased accessibility to academic information.

  12. Open access practices of selected library science journals

    • search.dataone.org
    • data.niaid.nih.gov
    Updated May 8, 2025
    Cite
    Jennifer Jordan; Blair Solon; Stephanie Beene (2025). Open access practices of selected library science journals [Dataset]. http://doi.org/10.5061/dryad.pvmcvdnt3
    Explore at:
    Dataset updated
    May 8, 2025
    Dataset provided by
    Dryad Digital Repository
    Authors
    Jennifer Jordan; Blair Solon; Stephanie Beene
    Description

    The data in this set was culled from the Directory of Open Access Journals (DOAJ), the ProQuest database Library and Information Science Abstracts (LISA), and a sample of peer-reviewed scholarly journals in the field of Library Science. The data include journals that are open access, which was first defined by the Budapest Open Access Initiative: "By ‘open access’ to [scholarly] literature, we mean its free availability on the public internet, permitting any users to read, download, copy, distribute, print, search, or link to the full texts of these articles, crawl them for indexing, pass them as data to software, or use them for any other lawful purpose, without financial, legal, or technical barriers other than those inseparable from gaining access to the internet itself." Starting with a batch of 377 journals, we focused our dataset to include journals that met the following criteria: 1) peer-reviewed, 2) written in English or abstracted in English, 3) actively published at the time of...

    Data Collection: In the spring of 2023, researchers gathered 377 scholarly journals whose content covered the work of librarians, archivists, and affiliated information professionals. This data encompassed 221 journals from the ProQuest database Library and Information Science Abstracts (LISA), widely regarded as an authoritative database in the field of librarianship. From the Directory of Open Access Journals, we included 144 LIS journals. We also included 12 other journals not indexed in DOAJ or LISA, based on the researchers’ knowledge of existing OA library journals. The data is separated into several different sets representing the different indices and journals we searched. The first set includes journals from the database LISA. The following fields are in this dataset (a loading sketch follows the field list):

    Journal: title of the journal

    Publisher: name of the publishing company

    Open Data Policy: lists whether an open data policy exists and what the policy is

    Country of publication: country where the journal is published
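    Given the field list above, here is a small sketch of how the LISA set might be loaded and filtered. The file name and exact column headers are assumptions, not part of the Dryad record.

```python
# Minimal sketch (assumed file name and headers): load the LISA journal set
# and list journals whose record includes an open data policy.
import pandas as pd

lisa = pd.read_csv("lisa_journals.csv")  # hypothetical export of the LISA set

has_policy = lisa[lisa["Open Data Policy"].notna()]
print(has_policy[["Journal", "Publisher", "Country of publication"]].head())
```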


  13. Data from: Where do engineering students really get their information? : using reference list analysis to improve information literacy programs

    • opal.latrobe.edu.au
    • researchdata.edu.au
    pdf
    Updated Mar 13, 2025
    Cite
    Clayton Bolitho (2025). Where do engineering students really get their information? : using reference list analysis to improve information literacy programs [Dataset]. http://doi.org/10.4225/22/59d45f4b696e4
    Explore at:
    pdf
    Dataset updated
    Mar 13, 2025
    Dataset provided by
    La Trobe
    Authors
    Clayton Bolitho
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Background: An understanding of the resources which engineering students use to write their academic papers provides information about student behaviour as well as the effectiveness of information literacy programs designed for engineering students. One of the most informative sources of information which can be used to determine the nature of the material that students use is the bibliography at the end of the students’ papers. While reference list analysis has been utilised in other disciplines, few studies have focussed on engineering students or used the results to improve the effectiveness of information literacy programs. Gadd, Baldwin and Norris (2010) found that civil engineering students undertaking a final-year research project cited journal articles more than other types of material, followed by books and reports, with web sites ranked fourth. Several studies, however, have shown that in their first year at least, most students prefer to use Internet search engines (Ellis & Salisbury, 2004; Wilkes & Gurney, 2009).

    PURPOSE: The aim of this study was to find out exactly what resources undergraduate students studying civil engineering at La Trobe University were using, and in particular, the extent to which students were utilising the scholarly resources paid for by the library. A secondary purpose of the research was to ascertain whether information literacy sessions delivered to those students had any influence on the resources used, and to investigate ways in which the information literacy component of the unit can be improved to encourage students to make better use of the resources purchased by the Library to support their research.

    DESIGN/METHOD: The study examined student bibliographies for three civil engineering group projects at the Bendigo Campus of La Trobe University over a two-year period, including two first-year units (CIV1EP – Engineering Practice) and one second-year unit (CIV2GR – Engineering Group Research). All units included a mandatory library session at the start of the project where student groups were required to meet with the relevant faculty librarian for guidance. In each case, the Faculty Librarian highlighted specific resources relevant to the topic, including books, e-books, video recordings, websites and internet documents. The students were also shown tips for searching the Library catalogue, Google Scholar, LibSearch (the LTU Library’s research and discovery tool) and ProQuest Central. Subject-specific databases for civil engineering and science were also referred to. After the final reports for each project had been submitted and assessed, the Faculty Librarian contacted the lecturer responsible for the unit, requesting copies of the student bibliographies for each group. References from each bibliography were then entered into EndNote. The Faculty Librarian grouped them according to various facets, including the name of the unit and the group within the unit; the material type of the item being referenced; and whether the item required a Library subscription to access it. A total of 58 references were collated for the 2010 CIV1EP unit; 237 references for the 2010 CIV2GR unit; and 225 references for the 2011 CIV1EP unit.

    INTERIM FINDINGS: The initial findings showed that student bibliographies for the three group projects were primarily made up of freely available internet resources which required no library subscription. For the 2010 CIV1EP unit, all 58 resources used were freely available on the Internet. For the 2011 CIV1EP unit, 28 of the 225 resources used (12.44%) required a Library subscription or purchase for access, while the second-year students (CIV2GR) used a greater variety of resources, with 71 of the 237 resources used (29.96%) requiring a Library subscription or purchase for access. The results suggest that the library sessions had little or no influence on the 2010 CIV1EP group, but the sessions may have assisted students in the 2011 CIV1EP and 2010 CIV2GR groups to find books, journal articles and conference papers, which were all represented in their bibliographies.

    FURTHER RESEARCH: The next step in the research is to investigate ways to increase the representation of scholarly references (found by resources other than Google) in student bibliographies. It is anticipated that such a change would lead to an overall improvement in the quality of the student papers. One way of achieving this would be to make it mandatory for students to include a specified number of journal articles, conference papers, or scholarly books in their bibliographies. It is also anticipated that embedding La Trobe University’s Inquiry/Research Quiz (IRQ) using a constructively aligned approach will further enhance the students’ research skills and increase their ability to find suitable scholarly material which relates to their topic. This has already been done successfully (Salisbury, Yager, & Kirkman, 2012).

    CONCLUSIONS & CHALLENGES: The study shows that most students rely heavily on the free Internet for information. Students don’t naturally use Library databases or scholarly resources such as Google Scholar to find information without encouragement from their teachers, tutors and/or librarians. It is acknowledged that the use of scholarly resources doesn’t automatically lead to a high quality paper. Resources must be used appropriately, and students also need the skills to identify and synthesise key findings in the existing literature and relate these to their own paper. Ideally, students should be able to see the benefit of using scholarly resources in their papers, and continue to seek these out even when it’s not a specific assessment requirement, though it can’t be assumed that this will be the outcome.

    REFERENCES:
    Ellis, J., & Salisbury, F. (2004). Information literacy milestones: building upon the prior knowledge of first-year students. Australian Library Journal, 53(4), 383-396.
    Gadd, E., Baldwin, A., & Norris, M. (2010). The citation behaviour of civil engineering students. Journal of Information Literacy, 4(2), 37-49.
    Salisbury, F., Yager, Z., & Kirkman, L. (2012). Embedding Inquiry/Research: Moving from a minimalist model to constructive alignment. Paper presented at the 15th International First Year in Higher Education Conference, Brisbane. Retrieved from http://www.fyhe.com.au/past_papers/papers12/Papers/11A.pdf
    Wilkes, J., & Gurney, L. J. (2009). Perceptions and applications of information literacy by first year applied science students. Australian Academic & Research Libraries, 40(3), 159-171.
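    The subscription-resource percentages quoted in the interim findings follow directly from the reference counts reported there; a short check, using only the numbers given in the description:

```python
# Recompute the share of references that required a Library subscription,
# using the counts reported above for each unit.
counts = {
    "2010 CIV1EP": (0, 58),    # (subscription-based, total references)
    "2011 CIV1EP": (28, 225),
    "2010 CIV2GR": (71, 237),
}
for unit, (paid, total) in counts.items():
    print(f"{unit}: {paid}/{total} = {100 * paid / total:.2f}% subscription-based")
# 2011 CIV1EP -> 12.44%, 2010 CIV2GR -> 29.96%, matching the reported figures.
```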

  14. International Journal of Global Science Research Impact Factor 2024-2025 - ResearchHelpDesk

    • researchhelpdesk.org
    Updated Feb 23, 2022
    Cite
    Research Help Desk (2022). International Journal of Global Science Research Impact Factor 2024-2025 - ResearchHelpDesk [Dataset]. https://www.researchhelpdesk.org/journal/impact-factor-if/513/international-journal-of-global-science-research
    Explore at:
    Dataset updated
    Feb 23, 2022
    Dataset authored and provided by
    Research Help Desk
    Description

    International Journal of Global Science Research Impact Factor 2024-2025 - ResearchHelpDesk - The International Journal of Global Science Research (IJGSR) is an open access, peer-reviewed, refereed, indexed, biannual, bilingual, impacted academic journal. IJGSR welcomes the submission of manuscripts via its online submission system for the current issue. Submit your recent research papers, short communications, reviews, extended versions of papers already published in conferences or journals, academic articles, and letters to the editor. IJGSR publishes papers on a broad range of topics in the areas of Environmental Sciences, Environmental Ethics, Environmental Legislation, Environmental Impact Assessment, Environmental Management, Environmental Policies, Environmental Pollution, Natural Resources Conservation, Biosciences, Agricultural Science, Anthropology and Behavioral Sciences, Animal Husbandry, Aquaculture, Biodiversity, Biotechnology, Biochemistry, Bioinformatics, Cell and Molecular Biology, Fish and Fisheries, Home Sciences, Immunology, Life Sciences, Limnology, Medical Sciences, Microbiology, Nutrition, Plant Sciences, Taxonomy, Tissue Culture, Toxicology, Veterinary Sciences, Wildlife Conservation, Zoology, Earth and Atmospheric Sciences, Mineralogy and Wildlife, which hold much promise for the future and are also within the scope of the International Journal of Global Science Research. The published papers are made highly visible to the scientific community through a wide indexing policy adopted by this online international journal. IJGSR is currently indexed in the ISSN Directory, NISCAIR, the Road Directory, Google Scholar, and INDEX COPERNICUS.

  15. Data from: Are Overall Journal Rankings a Good Mapping for Article Quality in Specialty Fields?

    • tandf.figshare.com
    pdf
    Updated Jun 1, 2023
    Cite
    Melody Lo; Yong Bao (2023). Are Overall Journal Rankings a Good Mapping for Article Quality in Specialty Fields? [Dataset]. http://doi.org/10.6084/m9.figshare.1306954.v2
    Explore at:
    pdf
    Dataset updated
    Jun 1, 2023
    Dataset provided by
    Taylor & Francis (https://taylorandfrancis.com/)
    Authors
    Melody Lo; Yong Bao
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Overall journal rankings, which are generated with sample articles in different research fields, are commonly used to measure the research productivity of academic economists. In this article, we investigate a growing concern in the profession that the use of overall journal rankings to evaluate scholars’ relative research productivity may exhibit a downward bias toward researchers in some specialty fields if their respective field journals are under-ranked in the overall journal rankings. To address this concern, we constructed new journal rankings based on the intellectual influence of research in 8 specialty fields using a sample consisting of 26,401 articles published across 60 economics journals from 1998 to 2007. We made various comparisons between the newly constructed journal rankings in specialty fields and the traditional overall journal ranking. Our results show that the overall journal ranking provides a considerably good mapping for article quality in specialty fields. Supplementary materials for this article are available online.

  16. Softcite Dataset: A dataset of software mentions in research publications

    • data.niaid.nih.gov
    • zenodo.org
    Updated Jan 17, 2021
    Cite
    Caifan Du (2021). Softcite Dataset: A dataset of software mentions in research publications [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_4444074
    Explore at:
    Dataset updated
    Jan 17, 2021
    Dataset provided by
    Caifan Du
    Patrice Lopez
    James Howison
    Hannah Cohoon
    Description

    The Softcite dataset is a gold-standard dataset of software mentions in research publications, a free resource primarily for software entity recognition in scholarly text. This is the first release of this dataset.

    What's in the dataset

    With the aim of facilitating software entity recognition efforts at scale and eventually increased visibility of research software for the due credit of software contributions to scholarly research, a team of trained annotators from Howison Lab at the University of Texas at Austin annotated 4,093 software mentions in 4,971 open access research publications in biomedicine (from PubMed Central Open Access collection) and economics (from Unpaywall open access services). The annotated software mentions, along with their publisher, version, and access URL, if mentioned in the text, as well as those publications annotated as containing no software mentions, are all included in the released dataset as a TEI/XML corpus file.

    For understanding the schema of the Softcite corpus, its design considerations, and provenance, please refer to our paper included in this release (preprint version).

    Use scenarios

    The release of the Softcite dataset is intended to encourage researchers and stakeholders to make research software more visible in science, especially to academic databases and systems of information retrieval; and facilitate interoperability and collaboration among similar and relevant efforts in software entity recognition and building utilities for software information retrieval. This dataset can also be useful for researchers investigating software use in academic research.

    Current release content

    softcite-dataset v1.0 release includes:

    The Softcite dataset corpus file: softcite_corpus-full.tei.xml

    Softcite Dataset: A Dataset of Software Mentions in Biomedical and Economic Research Publications, our paper that describes the design consideration and creation process of the dataset: Softcite_Dataset_Description_RC.pdf. (This is a preprint version of our forthcoming publication in the Journal of the Association for Information Science and Technology.)

    The Softcite dataset is licensed under a Creative Commons Attribution 4.0 International License.

    If you have questions, please start a discussion or issue in the howisonlab/softcite-dataset Github repository.
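    A hedged sketch of reading the released corpus file with Python's standard XML library follows. The assumption that software mentions are encoded as TEI <rs type="software"> elements is mine, not stated in this record, and should be checked against the schema described in the accompanying paper.

```python
# Minimal sketch: count annotated software mentions in the Softcite TEI corpus.
# Assumes mentions are encoded as <rs type="software"> in the TEI namespace;
# verify against the dataset's schema description before relying on this.
import xml.etree.ElementTree as ET

TEI_NS = {"tei": "http://www.tei-c.org/ns/1.0"}
tree = ET.parse("softcite_corpus-full.tei.xml")  # file name from the release notes

mentions = tree.getroot().findall(".//tei:rs[@type='software']", TEI_NS)
print(f"Software mentions found: {len(mentions)}")
```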

  17. Drivers and Barriers for Open Access Publishing - WoS 2016 Dataset

    • zenodo.org
    bin
    Updated Apr 24, 2025
    Cite
    Sergio Ruiz-Perez (2025). Drivers and Barriers for Open Access Publishing - WoS 2016 Dataset [Dataset]. http://doi.org/10.5281/zenodo.842013
    Explore at:
    bin
    Dataset updated
    Apr 24, 2025
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Sergio Ruiz-Perez
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Answers to a survey on gold Open Access run from July to October 2016. The dataset contains 15,235 unique responses from Web of Science published authors. This survey is part of a PhD thesis from the University of Granada in Spain. More details about the study can be found in the full text document, also available in Zenodo.

    The questions related to the WoS 2016 dataset are listed below (a small tabulation sketch follows the question list). Please note that countries with fewer than 40 answers are listed as "Other" in order to preserve anonymity.

    * 1. How many years have you been employed in research?

    • Fewer than 5 years
    • 5-14 years
    • 15-24 years
    • 25 years or longer

    Many of the questions that follow concern Open Access publishing. For the purposes of this survey, an article is Open Access if its final, peer-reviewed, version is published online by a journal and is free of charge to all users without restrictions on access or use.

    * 2. Do any journals in your research field publish Open Access articles?

    • Yes
    • No
    • I do not know

    * 3. Do you think your research field benefits, or would benefit from journals that publish Open Access articles?

    • Yes
    • No
    • I have no opinion
    • I do not care

    * 4. How many peer reviewed research articles (Open Access or not Open Access) have you published in the last five years?

    • 1-5
    • 6-10
    • 11-20
    • 21-50
    • More than 50

    * 5. What factors are important to you when selecting a journal to publish in?

    [Each factor may be rated “Extremely important”, “Important”, “Less important” or “Irrelevant”. The factors are presented in random order.]

    • Importance of the journal for academic promotion, tenure or assessment
    • Recommendation of the journal by my colleagues
    • Positive experience with publisher/editor(s) of the journal
    • The journal is an Open Access journal
    • Relevance of the journal for my community
    • The journal fits the policy of my organisation
    • Prestige/perceived quality of the journal
    • Likelihood of article acceptance in the journal
    • Absence of journal publication fees (e.g. submission charges, page charges, colour charges)
    • Copyright policy of the journal
    • Journal Impact Factor
    • Speed of publication of the journal

    6. Who usually decides which journals your articles are submitted to? (Choose more than one answer if applicable)

    • The decision is my own
    • A collective decision is made with my fellow authors
    • I am advised where to publish by a senior colleague
    • The organisation that finances my research advises me where to publish
    • Other (please specify) [Text box follows]

    7. Approximately how many Open Access articles have you published in the last five years?

    • 0
    • 1-5
    • 6-10
    • More than 10
    • I do not know

    [If the answer is “0”, the survey jumps to Q10.]

    * 8. What publication fee was charged for the last Open Access article you published?

    • No charge
    • Up to €250 ($275)
    • €251-€500 ($275-$550)
    • €501-€1000 ($551-$1100)
    • €1001-€3000 ($1101-$3300)
    • More than €3000 ($3300)
    • I do not know

    [If the answer is “No charge or I don’t know” the survey jumps to Q20. ]

    * 9. How was this publication fee covered? (Choose more than one answer if applicable)

    • My research funding includes money for paying such fees
    • I used part of my research funding not specifically intended for paying such fees
    • My institution paid the fees
    • I paid the costs myself
    • Other (please specify) [Text box follows]

    * 10. How easy is it to obtain funding if needed for Open Access publishing from your institution or the organisation mainly responsible for financing your research?

    • Easy
    • Difficult
    • I have not used these sources

    * 11. Listed below are a series of statements, both positive and negative, concerning Open Access publishing. Please indicate how strongly you agree/disagree with each statement.

    [Each statement may be rated “Strongly agree”, “Agree”, “Neither agree nor disagree”, “Disagree” or “Strongly disagree”. The statements are presented in random order.]

    • Researchers should retain the rights to their published work and allow it to be used by others
    • Open Access publishing undermines the system of peer review
    • Open Access publishing leads to an increase in the publication of poor quality research
    • If authors pay publication fees to make their articles Open Access, there will be less money available for research
    • It is not beneficial for the general public to have access to published scientific and medical articles
    • Open Access unfairly penalises research-intensive institutions with large publication output by making them pay high costs for publication
    • Publicly-funded research should be made available to be read and used without access barriers
    • Open Access publishing is more cost-effective than subscription-based publishing and so will benefit public investment in research
    • Articles that are available by Open Access are likely to be read and cited more often than those not Open Access

    This study and its questionnaire are based on the SOAP Project (http://project-soap.eu). An article describing the highlights of the SOAP Survey is available at: https://arxiv.org/abs/1101.5260. The dataset of the SOAP survey is available at http://bit.ly/gSmm71. A manual describing the SOAP dataset is available at http://bit.ly/gI8nc.
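
    For readers who want to work with the released SOAP survey data directly, a minimal Python sketch is given below. It is only an illustration under stated assumptions: the filename soap_survey.csv and the column names Q7 and Q8 are hypothetical placeholders, and the actual file layout and response coding are described in the SOAP manual linked above.

      import pandas as pd  # assumes pandas is available

      # Hypothetical filename and column names -- consult the SOAP manual for the
      # real layout before running this against the downloaded dataset.
      responses = pd.read_csv("soap_survey.csv")

      # Tabulate answers to Q7 ("Approximately how many Open Access articles have
      # you published in the last five years?").
      print(responses["Q7"].value_counts(dropna=False))

      # The questionnaire's skip logic means Q8 (publication fee of the last Open
      # Access article) is only asked of respondents who did not answer "0" to Q7,
      # so restrict to those respondents before tabulating fees.
      oa_authors = responses[responses["Q7"] != "0"]
      print(oa_authors["Q8"].value_counts(dropna=False))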

  18. Data from: Data Papers as a New Form of Knowledge Organization in the Field...

    • ssh.datastations.nl
    ods, pdf, zip
    Updated Jun 7, 2019
    + more versions
    Cite
    J. Schöpfel; J. Schöpfel (2019). Data Papers as a New Form of Knowledge Organization in the Field of Research Data [Dataset]. http://doi.org/10.17026/DANS-ZK3-JKYB
    Explore at:
    Available download formats: ods(15303), ods(20739), pdf(216582), zip(18880)
    Dataset updated
    Jun 7, 2019
    Dataset provided by
    DANS Data Station Social Sciences and Humanities
    Authors
    J. Schöpfel; J. Schöpfel
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description

    In order to analyse specific features of data papers, we established a representative sample of data journals, based on lists from the European FOSTER Plus project, the German wiki forschungsdaten.org hosted by the University of Konstanz, and two French research organizations. The complete list consists of 82 data journals, i.e. journals which publish data papers. They represent less than 0.5% of academic and scholarly journals. For each of these 82 data journals, we gathered information about the discipline, the global business model, the publisher, peer reviewing, etc. The analysis is partly based on data from ProQuest’s Ulrichsweb database, enriched and completed by information available on the journals’ home pages.

    One part of the data journals is presented as “pure” data journals stricto sensu, i.e. journals which publish exclusively or mainly data papers. We identified 28 journals of this category (34%). For each of these journals, we assessed through direct search on the journals’ homepages (information about the journal, authors’ guidelines, etc.) the use of identifiers and metadata, the mode of selection and the business model, and we assessed different parameters of the data papers themselves, such as length, structure and linking.

    The results of this analysis are compared with other research journals (“mixed” data journals) which publish data papers along with regular research articles, in order to identify possible differences between both journal categories, on the level of data papers as well as on the level of the regular research papers. Moreover, the results are discussed against concepts of knowledge organization.
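
    As a quick sanity check on the proportions quoted in the description, the short Python sketch below (illustrative only, not part of the deposited dataset) reproduces the reported share of “pure” data journals and the population size implied by the “less than 0.5%” figure.

      # Figures taken from the description above: 82 sampled data journals,
      # of which 28 are "pure" data journals.
      total_data_journals = 82
      pure_data_journals = 28

      pure_share = pure_data_journals / total_data_journals
      print(f"Share of pure data journals: {pure_share:.0%}")  # ~34%, matching the text

      # "Less than 0.5% of academic and scholarly journals" implies a total
      # population of at least 82 / 0.005 journals.
      implied_minimum_population = total_data_journals / 0.005
      print(f"Implied minimum number of journals: {implied_minimum_population:,.0f}")  # 16,400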

  19. International journal of scientific research Abstract & Indexing -...

    • researchhelpdesk.org
    Updated Jun 18, 2022
    + more versions
    Cite
    Research Help Desk (2022). International journal of scientific research Abstract & Indexing - ResearchHelpDesk [Dataset]. https://www.researchhelpdesk.org/journal/abstract-and-indexing/539/international-journal-of-scientific-research
    Explore at:
    Dataset updated
    Jun 18, 2022
    Dataset authored and provided by
    Research Help Desk
    Description

    International journal of scientific research Abstract & Indexing - ResearchHelpDesk - IJSR (International Journal of Scientific Research) is a double-reviewed, peer-reviewed monthly print journal that accepts intensive and exclusive research work in all spheres of medical science from academicians, professors and residents in their respective medical fields. The journal aims to disseminate high-quality research to the medical fraternity in the form of original research papers, case reports, review reports, etc. The papers published are in line with and acceptable to the Medical Council of India (MCI) and other statutory authorities in India and across the world. The journal is released on the 1st of every month.

    Open access publishing: IJSR is an open-access publication and its content is therefore free for anybody to access online, to read and download, as well as to copy and disseminate for educational purposes. Articles are published online immediately upon acceptance and production of the final formatted version.

    The medical subject areas covered include: Anatomy, Anesthesiology, Ayurveda, Biochemistry, Cardiology, Clinical Research, Clinical Science, Community Medicine, Dental Science, Dermatology, Diabetology, Electrotherapy, Endocrinology, Endodontic, ENT, Epidemiology, Forensic Medicine, Forensic Science, Gastroenterology, General Medicine, General Surgery, Genetics, Gynaecology, Health Science, Healthcare, Hepatobiliary Surgery, Homeopathic, Human Genetics, Immunohaematology, Immunology, Medical Physics, Medical Science, Medicine, Microbiology, Morphology, Neonatology, Nephrology, Neurology, Neurosurgery, Nursing, Gynaecology, Oncology, Ophthalmology, Oral Medicine, Oral Pathology, Orthodontology, Orthopaedics, Paediatrics, Pathology, Periodontology, Pharma Otolaryngology, Pharmaceutical, Pharmacology, Pharmacy, Physiology, Physiotherapy, Plastic Surgery, Prosthodontics, Psychiatry, Pulmonary Medicine, Radiodiagnosis, Radiology, Rehabilitation Science, Rheumatology, Surgery, Unani Medicine, Urology.

    Editorial and peer review processes: IJSR uses double-blind peer review. Referees remain anonymous to the author during the review procedure, and the author's name is removed from the manuscript under review. Only after publication, and only with the permission of the referee, are the names of the reviewers published in the article. Each article is first assessed by two editors of the editorial board and, if it is judged suitable for IJAR, it is sent to two or three external referees for double-blind peer review. IJAR uses three different review forms (Research and Theory, Integrated Care Cases, Policy Papers), all of which apply scientific criteria and take account of the purpose of the article and its merits for integrated care. Based on the recommendations of the reviewers, the editors then decide whether the paper should be accepted as is, revised or rejected. In the case of revisions, a final decision on publication is made after resubmission. If the editors do not agree, the editor-in-chief makes the final decision.

    Abstract & indexing: Google Scholar, Index Copernicus (ICV: 78.46), IISS, DJOF, DRJI, Cite Factor, ISI, Genamics, EZ3, Open J-Gate, CrossRef.

  20. Dataset of books called Evaluating research in academic journals : a...

    • workwithdata.com
    Updated Apr 17, 2025
    Cite
    Work With Data (2025). Dataset of books called Evaluating research in academic journals : a practical guide to realistic evaluation [Dataset]. https://www.workwithdata.com/datasets/books?f=1&fcol0=book&fop0=%3D&fval0=Evaluating+research+in+academic+journals+%3A+a+practical+guide+to+realistic+evaluation
    Explore at:
    Dataset updated
    Apr 17, 2025
    Dataset authored and provided by
    Work With Data
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This dataset is about books. It has 1 row and is filtered to the book Evaluating research in academic journals : a practical guide to realistic evaluation. It features 7 columns, including author, publication date, language, and book publisher.
