100+ datasets found
  1. Scientific JOURNALS Indicators & Info - SCImagoJR

    • kaggle.com
    Updated Apr 9, 2025
    Cite
    Ali Jalaali (2025). Scientific JOURNALS Indicators & Info - SCImagoJR [Dataset]. https://www.kaggle.com/datasets/alijalali4ai/scimagojr-scientific-journals-dataset
    Explore at:
    Croissant (a format for machine-learning datasets; learn more at mlcommons.org/croissant)
    Dataset updated
    Apr 9, 2025
    Dataset provided by
    Kaggle
    Authors
    Ali Jalaali
    Description


    The SCImago Journal & Country Rank is a publicly available portal that includes the journals and country scientific indicators developed from the information contained in the Scopus® database (Elsevier B.V.). These indicators can be used to assess and analyze scientific domains. Journals can be compared or analysed separately.


    💬Also have a look at
    💡 COUNTRIES Research & Science Dataset - SCImagoJR
    💡 UNIVERSITIES & Research INSTITUTIONS Rank - SCImagoIR

    • Journals can be grouped by subject area (27 major thematic areas), subject category (309 specific subject categories) or by country.
    • Citation data is drawn from over 34,100 titles from more than 5,000 international publishers
    • This platform takes its name from the SCImago Journal Rank (SJR) indicator, developed by SCImago from the widely known Google PageRank™ algorithm. This indicator shows the visibility of the journals contained in the Scopus® database since 1996.
    • SCImago is a research group from the Consejo Superior de Investigaciones Científicas (CSIC), University of Granada, Extremadura, Carlos III (Madrid) and Alcalá de Henares, dedicated to information analysis, representation and retrieval by means of visualisation techniques.

    ☢️❓The entire dataset is obtained from public and open-access data of ScimagoJR (SCImago Journal & Country Rank)
    ScimagoJR Journal Rank
    SCImagoJR About Us

    Available indicators:

    • SJR (SCImago Journal Rank) indicator: expresses the average number of weighted citations received in the selected year by the documents published in the selected journal in the three previous years, i.e. weighted citations received in year X to documents published in the journal in years X-1, X-2 and X-3. See the detailed description of SJR (PDF).
    • H Index: the journal's number of articles (h) that have received at least h citations. It quantifies both scientific productivity and scientific impact, and it is also applicable to scientists, countries, etc. (see the H-index Wikipedia definition, and the illustrative sketch after this list).
    • Total Documents: output of the selected period. All types of documents are considered, including citable and non-citable documents.
    • Total Documents (3 years): documents published in the three previous years (documents from the selected year are excluded), i.e. when year X is selected, documents published in X-1, X-2 and X-3 are retrieved. All types of documents are considered, including citable and non-citable documents.
    • Citable Documents (3 years): number of citable documents published by a journal in the three previous years (documents from the selected year are excluded). Only articles, reviews and conference papers are considered. Non-citable Docs. (available in the graphics): the ratio of non-citable documents in the period considered.
    • Total Cites (3 years): number of citations received in the selected year by a journal for the documents published in the three previous years, i.e. citations received in year X to documents published in years X-1, X-2 and X-3. All types of documents are considered.
    • Cites per Document (2 years): average citations per document over a 2-year period, computed as the number of citations received by a journal in the current year to the documents published in the two previous years, i.e. citations received in year X to documents published in years X-1 and X-2.
    • Cites per Document (3 years): average citations per document over a 3-year period, computed as the number of citations received by a journal in the current year to the documents published in the three previous years, i.e. citations received in year X to documents published in years X-1, X-2 and X-3.
    • Self Cites: number of a journal's self-citations in the selected year to its own documents published in the three previous years, i.e. self-citations in year X to documents published in years X-1, X-2 and X-3. All types of documents are considered.
    • Cited Documents: number of documents cited at least once in the three previous years, i.e. years X-1, X-2 and X-3.
    • Uncited Documents: number of uncited documents in the three previous years, i.e. years X-1, X-2 and X-3.
    • Total References: all the bibliographical references in a journal in the selected period.
    • References per Document: average number of references per document in the selected year.
    • % International Collaboration: ratio of documents whose affiliations include more than one country address.
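
    To make the citation-window arithmetic above concrete, here is a minimal, illustrative Python sketch (not part of the dataset) that computes an h-index from a list of per-article citation counts and a Cites per Document (2 years) value; the function names, variable names and example numbers are assumptions for illustration only.

        def h_index(citations_per_article):
            """h-index: the largest h such that at least h articles have >= h citations."""
            counts = sorted(citations_per_article, reverse=True)
            h = 0
            for rank, cites in enumerate(counts, start=1):
                if cites >= rank:
                    h = rank
                else:
                    break
            return h

        def cites_per_doc_2y(cites_in_year_x_to_prev_2y, docs_published_prev_2y):
            """Cites per Document (2 years): citations received in year X to documents
            published in years X-1 and X-2, divided by the number of those documents."""
            if docs_published_prev_2y == 0:
                return 0.0
            return cites_in_year_x_to_prev_2y / docs_published_prev_2y

        # Hypothetical example values, for illustration only.
        print(h_index([12, 7, 5, 3, 1]))   # -> 3
        print(cites_per_doc_2y(450, 180))  # -> 2.5
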
  2. Data articles in journals

    • zenodo.org
    bin, csv, txt
    Updated Sep 21, 2023
    + more versions
    Cite
    Carlota Balsa-Sanchez; Vanesa Loureiro (2023). Data articles in journals [Dataset]. http://doi.org/10.5281/zenodo.7458466
    Explore at:
    Available download formats: bin, txt, csv
    Dataset updated
    Sep 21, 2023
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Carlota Balsa-Sanchez; Vanesa Loureiro
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Last Version: 4

    Authors: Carlota Balsa-Sánchez, Vanesa Loureiro

    Date of data collection: 2022/12/15

    General description: Publishing datasets in line with the FAIR principles can be achieved by publishing a data paper (or software paper) in a data journal or in a standard academic journal. The Excel and CSV files contain a list of academic journals that publish data papers and software papers.
    File list:

    - data_articles_journal_list_v4.xlsx: full list of 140 academic journals in which data papers and/or software papers can be published
    - data_articles_journal_list_v4.csv: full list of 140 academic journals in which data papers and/or software papers can be published

    Relationship between files: both files contain the same information; two different formats are offered to improve reuse.

    Type of version of the dataset: final processed version

    Versions of the files: 4th version
    - Information updated: number of journals, URL, document types associated with a specific journal, publisher normalization and simplification of document types
    - Information added: listed in the Directory of Open Access Journals (DOAJ), indexed in Web of Science (WOS), and quartile in Journal Citation Reports (JCR) and/or Scimago Journal and Country Rank (SJR), Scopus and Web of Science (WOS), Journal Master List.

    Version: 3

    Authors: Carlota Balsa-Sánchez, Vanesa Loureiro

    Date of data collection: 2022/10/28

    General description: Publishing datasets in line with the FAIR principles can be achieved by publishing a data paper (or software paper) in a data journal or in a standard academic journal. The Excel and CSV files contain a list of academic journals that publish data papers and software papers.
    File list:

    - data_articles_journal_list_v3.xlsx: full list of 124 academic journals in which data papers and/or software papers can be published
    - data_articles_journal_list_3.csv: full list of 124 academic journals in which data papers and/or software papers can be published

    Relationship between files: both files contain the same information; two different formats are offered to improve reuse.

    Type of version of the dataset: final processed version

    Versions of the files: 3rd version
    - Information updated: number of journals, URL, document types associated with a specific journal, publisher normalization and simplification of document types
    - Information added: listed in the Directory of Open Access Journals (DOAJ), indexed in Web of Science (WOS), and quartile in Journal Citation Reports (JCR) and/or Scimago Journal and Country Rank (SJR).

    Erratum - Data articles in journals Version 3:

    Botanical Studies -- ISSN 1999-3110 -- JCR (JIF) Q2
    Data -- ISSN 2306-5729 -- JCR (JIF) n/a
    Data in Brief -- ISSN 2352-3409 -- JCR (JIF) n/a

    Version: 2

    Author: Francisco Rubio, Universitat Politècnica de València.

    Date of data collection: 2020/06/23

    General description: Publishing datasets in line with the FAIR principles can be achieved by publishing a data paper (or software paper) in a data journal or in a standard academic journal. The Excel and CSV files contain a list of academic journals that publish data papers and software papers.
    File list:

    - data_articles_journal_list_v2.xlsx: full list of 56 academic journals in which data papers and/or software papers can be published
    - data_articles_journal_list_v2.csv: full list of 56 academic journals in which data papers and/or software papers can be published

    Relationship between files: both files contain the same information; two different formats are offered to improve reuse.

    Type of version of the dataset: final processed version

    Versions of the files: 2nd version
    - Information updated: number of journals, URL, document types associated with a specific journal, publisher normalization and simplification of document types
    - Information added: listed in the Directory of Open Access Journals (DOAJ), indexed in Web of Science (WOS), and quartile in Scimago Journal and Country Rank (SJR)

    Total size: 32 KB

    Version 1: Description

    This dataset contains a list of journals that publish data articles, code, software articles and database articles.

    The search strategy in DOAJ and Ulrichsweb was to search for the word "data" in journal titles.
    Acknowledgements:
    Xaquín Lores Torres for his invaluable help in preparing this dataset.
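
    The journal lists above are distributed as .xlsx and .csv files. As a minimal reuse sketch (assuming the CSV has been downloaded locally from the Zenodo record and uses comma delimiters; the "Journal" and "DOAJ" column names below are assumptions for illustration, not guaranteed headers):

        import pandas as pd

        # Load the v4 journal list (download it from the Zenodo record first).
        journals = pd.read_csv("data_articles_journal_list_v4.csv")

        # Inspect the real column headers before relying on any of them.
        print(journals.shape)              # expected: one row per journal (140 in v4)
        print(journals.columns.tolist())

        # Hypothetical follow-up, assuming columns named "Journal" and "DOAJ" exist:
        # doaj_listed = journals[journals["DOAJ"].str.lower() == "yes"]
        # print(doaj_listed["Journal"].head())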

  3. Data articles in journals

    • data.niaid.nih.gov
    Updated Sep 22, 2023
    Cite
    Balsa-Sanchez, Carlota (2023). Data articles in journals [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_3753373
    Explore at:
    Dataset updated
    Sep 22, 2023
    Dataset provided by
    Balsa-Sanchez, Carlota
    Loureiro, Vanesa
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Version: 5

    Authors: Carlota Balsa-Sánchez, Vanesa Loureiro

    Date of data collection: 2023/09/05

    General description: Publishing datasets in line with the FAIR principles can be achieved by publishing a data paper (or software paper) in a data journal or in a standard academic journal. The Excel and CSV files contain a list of academic journals that publish data papers and software papers.
    File list:

    • data_articles_journal_list_v5.xlsx: full list of 140 academic journals in which data papers and/or software papers can be published
    • data_articles_journal_list_v5.csv: full list of 140 academic journals in which data papers and/or software papers can be published

    Relationship between files: both files contain the same information; two different formats are offered to improve reuse.

    Type of version of the dataset: final processed version

    Versions of the files: 5th version
    - Information updated: number of journals, URL, document types associated with a specific journal.

    Version: 4

    Authors: Carlota Balsa-Sánchez, Vanesa Loureiro

    Date of data collection: 2022/12/15

    General description: Publishing datasets in line with the FAIR principles can be achieved by publishing a data paper (or software paper) in a data journal or in a standard academic journal. The Excel and CSV files contain a list of academic journals that publish data papers and software papers.
    File list:

    • data_articles_journal_list_v4.xlsx: full list of 140 academic journals in which data papers and/or software papers can be published
    • data_articles_journal_list_v4.csv: full list of 140 academic journals in which data papers and/or software papers can be published

    Relationship between files: both files contain the same information; two different formats are offered to improve reuse.

    Type of version of the dataset: final processed version

    Versions of the files: 4th version
    - Information updated: number of journals, URL, document types associated with a specific journal, publisher normalization and simplification of document types
    - Information added: listed in the Directory of Open Access Journals (DOAJ), indexed in Web of Science (WOS), and quartile in Journal Citation Reports (JCR) and/or Scimago Journal and Country Rank (SJR), Scopus and Web of Science (WOS), Journal Master List.

    Version: 3

    Authors: Carlota Balsa-Sánchez, Vanesa Loureiro

    Date of data collection: 2022/10/28

    General description: Publishing datasets in line with the FAIR principles can be achieved by publishing a data paper (or software paper) in a data journal or in a standard academic journal. The Excel and CSV files contain a list of academic journals that publish data papers and software papers.
    File list:

    • data_articles_journal_list_v3.xlsx: full list of 124 academic journals in which data papers and/or software papers can be published
    • data_articles_journal_list_3.csv: full list of 124 academic journals in which data papers and/or software papers can be published

    Relationship between files: both files contain the same information; two different formats are offered to improve reuse.

    Type of version of the dataset: final processed version

    Versions of the files: 3rd version
    - Information updated: number of journals, URL, document types associated with a specific journal, publisher normalization and simplification of document types
    - Information added: listed in the Directory of Open Access Journals (DOAJ), indexed in Web of Science (WOS), and quartile in Journal Citation Reports (JCR) and/or Scimago Journal and Country Rank (SJR).

    Erratum - Data articles in journals Version 3:

    Botanical Studies -- ISSN 1999-3110 -- JCR (JIF) Q2
    Data -- ISSN 2306-5729 -- JCR (JIF) n/a
    Data in Brief -- ISSN 2352-3409 -- JCR (JIF) n/a

    Version: 2

    Author: Francisco Rubio, Universitat Politècnica de València.

    Date of data collection: 2020/06/23

    General description: Publishing datasets in line with the FAIR principles can be achieved by publishing a data paper (or software paper) in a data journal or in a standard academic journal. The Excel and CSV files contain a list of academic journals that publish data papers and software papers.
    File list:

    • data_articles_journal_list_v2.xlsx: full list of 56 academic journals in which data papers and/or software papers can be published
    • data_articles_journal_list_v2.csv: full list of 56 academic journals in which data papers and/or software papers can be published

    Relationship between files: both files contain the same information; two different formats are offered to improve reuse.

    Type of version of the dataset: final processed version

    Versions of the files: 2nd version
    - Information updated: number of journals, URL, document types associated with a specific journal, publisher normalization and simplification of document types
    - Information added: listed in the Directory of Open Access Journals (DOAJ), indexed in Web of Science (WOS), and quartile in Scimago Journal and Country Rank (SJR)

    Total size: 32 KB

    Version 1: Description

    This dataset contains a list of journals that publish data articles, code, software articles and database articles.

    The search strategy in DOAJ and Ulrichsweb was to search for the word "data" in journal titles. Acknowledgements: Xaquín Lores Torres for his invaluable help in preparing this dataset.

  4. Data articles in journals

    • zenodo.org
    bin, csv, txt
    Updated Sep 21, 2023
    Cite
    Carlota Balsa-Sanchez; Francisco Rubio (2023). Data articles in journals [Dataset]. http://doi.org/10.5281/zenodo.3921974
    Explore at:
    Available download formats: csv, txt, bin
    Dataset updated
    Sep 21, 2023
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Carlota Balsa-Sanchez; Francisco Rubio
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Last Version: 2

    Author: Francisco Rubio, Universitat Politècnica de València.

    Date of data collection: 2020/06/23

    General description: Publishing datasets in line with the FAIR principles can be achieved by publishing a data paper (or software paper) in a data journal or in a standard academic journal. The Excel and CSV files contain a list of academic journals that publish data papers and software papers.
    File list:

    - data_articles_journal_list_v2.xlsx: full list of 56 academic journals in which data papers and/or software papers can be published
    - data_articles_journal_list_v2.csv: full list of 56 academic journals in which data papers and/or software papers can be published

    Relationship between files: both files contain the same information; two different formats are offered to improve reuse.

    Type of version of the dataset: final processed version

    Versions of the files: 2nd version
    - Information updated: number of journals, URL, document types associated with a specific journal, publisher normalization and simplification of document types
    - Information added: listed in the Directory of Open Access Journals (DOAJ), indexed in Web of Science (WOS), and quartile in Scimago Journal and Country Rank (SJR)

    Total size: 32 KB

    Version 1: Description

    This dataset contains a list of journals that publish data articles, code, software articles and database articles.

    The search strategy in DOAJ and Ulrichsweb was to search for the word "data" in journal titles.
    Acknowledgements:
    Xaquín Lores Torres for his invaluable help in preparing this dataset.

  5. Top Universities Computer Science

    • kaggle.com
    Updated Apr 20, 2023
    Cite
    Bogdan Sorin Miauca (2023). Top Universities Computer Science [Dataset]. https://www.kaggle.com/datasets/bogdansorin/top-universities-computer-science
    Explore at:
    Croissant (a format for machine-learning datasets; learn more at mlcommons.org/croissant)
    Dataset updated
    Apr 20, 2023
    Dataset provided by
    Kaggle
    Authors
    Bogdan Sorin Miauca
    Description

    The dataset contains the best universities, ranked by computer science subject, from 2020 to 2023; universities can change their rank and scores over the years.

    If you find this dataset useful, you can leave a like.

    Columns

    rank: the placement of the university relative to the others

    year: the year in which the university was ranked

    cs_overall_score: the university's overall score in the computer science subject

    cs_academic_reputation: score for how well known the university is

    cs_emloyer_reputation: score for the reputation of the teachers and researchers

    cs_citations_per_paper: score for the quality of the scientific papers

    cs_index_citations: score for the citation index of the scientific papers

    cs_international_research: score for international participation

    locatio: country and city where the university is located

  6. Scimago Journal Rankings

    • hgxjs.org
    • search.webdepozit.sk
    • +6more
    csv
    Updated Oct 7, 2024
    Cite
    Scimago Lab (2024). Scimago Journal Rankings [Dataset]. http://hgxjs.org/journalrank0138.html
    Explore at:
    Available download formats: csv
    Dataset updated
    Oct 7, 2024
    Dataset authored and provided by
    Scimago Lab
    Description

    Academic journals indicators developed from the information contained in the Scopus database (Elsevier B.V.). These indicators can be used to assess and analyze scientific domains.

  7. PLOS Open Science Indicators

    • plos.figshare.com
    zip
    Updated Jul 10, 2025
    Cite
    Public Library of Science (2025). PLOS Open Science Indicators [Dataset]. http://doi.org/10.6084/m9.figshare.21687686.v10
    Explore at:
    Available download formats: zip
    Dataset updated
    Jul 10, 2025
    Dataset provided by
    PLOS (http://plos.org/)
    Authors
    Public Library of Science
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This dataset contains article metadata and information about Open Science Indicators for approximately 139,000 research articles published in PLOS journals from 1 January 2018 to 30 March 2025, and for a set of approximately 28,000 comparator articles published in non-PLOS journals. This is the tenth release of this dataset, which will be updated with new versions on an annual basis. This version of the Open Science Indicators dataset shares the indicators seen in previous versions as well as fully operationalised protocols and study registration indicators, which were previously only shared in preliminary forms.

    The v10 dataset focuses on detection of five Open Science practices by analysing the XML of published research articles:
    • Sharing of research data, in particular data shared in data repositories
    • Sharing of code
    • Posting of preprints
    • Sharing of protocols
    • Sharing of study registrations

    The dataset provides data and code generation and sharing rates, and the location of shared data and code (whether in Supporting Information or in an online repository). It also provides preprint, protocol and study registration sharing rates, as well as details of the shared output, such as publication date, URL/DOI/Registration Identifier and platform used. Additional data fields are also provided for each article analysed. This release has been run using an updated preprint detection method (see OSI-Methods-Statement_v10_Jul25.pdf for details). Further information on the methods used to collect and analyse the data can be found in Documentation. Further information on the principles and requirements for developing Open Science Indicators is available at https://doi.org/10.6084/m9.figshare.21640889.

    Data folders/files

    Data Files folder: contains the main OSI dataset files, PLOS-Dataset_v10_Jul25.csv and Comparator-Dataset_v10_Jul25.csv, which contain descriptive metadata (e.g. article title, publication data, author countries, taken from the article .xml files) and additional information around the Open Science Indicators derived algorithmically. The OSI-Summary-statistics_v10_Jul25.xlsx file contains the summary data for both PLOS-Dataset_v10_Jul25.csv and Comparator-Dataset_v10_Jul25.csv.

    Documentation folder: contains documentation related to the main data files. OSI-Methods-Statement_v10_Jul25.pdf describes the methods underlying the data collection and analysis. OSI-Column-Descriptions_v10_Jul25.pdf describes the fields used in PLOS-Dataset_v10_Jul25.csv and Comparator-Dataset_v10_Jul25.csv. OSI-Repository-List_v1_Dec22.xlsx lists the repositories, and their characteristics, used to identify specific repositories in the repository fields of PLOS-Dataset_v10_Jul25.csv and Comparator-Dataset_v10_Jul25.csv. The folder also contains documentation originally shared alongside the preliminary versions of the protocols and study registration indicators, in order to give fuller details of their detection methods.

    Contact details for further information:
    Iain Hrynaszkiewicz, Director, Open Research Solutions, PLOS, ihrynaszkiewicz@plos.org / plos@plos.org
    Lauren Cadwallader, Open Research Manager, PLOS, lcadwallader@plos.org / plos@plos.org

    Acknowledgements: thanks to Allegra Pearce, Tim Vines, Asura Enkhbayar, Scott Kerr and parth sarin of DataSeer for contributing to data acquisition and supporting information.

  8. The Assessment of Science: The Relative Merits of Post-Publication Review,...

    • plos.figshare.com
    docx
    Updated Jun 1, 2023
    + more versions
    Cite
    Adam Eyre-Walker; Nina Stoletzki (2023). The Assessment of Science: The Relative Merits of Post-Publication Review, the Impact Factor, and the Number of Citations [Dataset]. http://doi.org/10.1371/journal.pbio.1001675
    Explore at:
    Available download formats: docx
    Dataset updated
    Jun 1, 2023
    Dataset provided by
    PLOS (http://plos.org/)
    Authors
    Adam Eyre-Walker; Nina Stoletzki
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    The assessment of scientific publications is an integral part of the scientific process. Here we investigate three methods of assessing the merit of a scientific paper: subjective post-publication peer review, the number of citations gained by a paper, and the impact factor of the journal in which the article was published. We investigate these methods using two datasets in which subjective post-publication assessments of scientific publications have been made by experts. We find that there are moderate, but statistically significant, correlations between assessor scores, when two assessors have rated the same paper, and between assessor score and the number of citations a paper accrues. However, we show that assessor score depends strongly on the journal in which the paper is published, and that assessors tend to over-rate papers published in journals with high impact factors. If we control for this bias, we find that the correlation between assessor scores and between assessor score and the number of citations is weak, suggesting that scientists have little ability to judge either the intrinsic merit of a paper or its likely impact. We also show that the number of citations a paper receives is an extremely error-prone measure of scientific merit. Finally, we argue that the impact factor is likely to be a poor measure of merit, since it depends on subjective assessment. We conclude that the three measures of scientific merit considered here are poor; in particular subjective assessments are an error-prone, biased, and expensive method by which to assess merit. We argue that the impact factor may be the most satisfactory of the methods we have considered, since it is a form of pre-publication review. However, we emphasise that it is likely to be a very error-prone measure of merit that is qualitative, not quantitative.

  9. A study of the impact of data sharing on article citations using journal...

    • plos.figshare.com
    • dataverse.harvard.edu
    • +1more
    docx
    Updated Jun 1, 2023
    Share
    FacebookFacebook
    TwitterTwitter
    Email
    Click to copy link
    Link copied
    Close
    Cite
    Garret Christensen; Allan Dafoe; Edward Miguel; Don A. Moore; Andrew K. Rose (2023). A study of the impact of data sharing on article citations using journal policies as a natural experiment [Dataset]. http://doi.org/10.1371/journal.pone.0225883
    Explore at:
    Available download formats: docx
    Dataset updated
    Jun 1, 2023
    Dataset provided by
    PLOS (http://plos.org/)
    Authors
    Garret Christensen; Allan Dafoe; Edward Miguel; Don A. Moore; Andrew K. Rose
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This study estimates the effect of data sharing on the citations of academic articles, using journal policies as a natural experiment. We begin by examining 17 high-impact journals that have adopted the requirement that data from published articles be publicly posted. We match these 17 journals to 13 journals without policy changes and find that empirical articles published just before their change in editorial policy have citation rates with no statistically significant difference from those published shortly after the shift. We then ask whether this null result stems from poor compliance with data sharing policies, and use the data sharing policy changes as instrumental variables to examine more closely two leading journals in economics and political science with relatively strong enforcement of new data policies. We find that articles that make their data available receive 97 additional citations (estimate standard error of 34). We conclude that: a) authors who share data may be rewarded eventually with additional scholarly citations, and b) data-posting policies alone do not increase the impact of articles published in a journal unless those policies are enforced.

  10. Open access practices of selected library science journals

    • search.dataone.org
    • data.niaid.nih.gov
    • +1more
    Updated May 8, 2025
    Cite
    Jennifer Jordan; Blair Solon; Stephanie Beene (2025). Open access practices of selected library science journals [Dataset]. http://doi.org/10.5061/dryad.pvmcvdnt3
    Explore at:
    Dataset updated
    May 8, 2025
    Dataset provided by
    Dryad Digital Repository
    Authors
    Jennifer Jordan; Blair Solon; Stephanie Beene
    Description

    The data in this set was culled from the Directory of Open Access Journals (DOAJ), the ProQuest database Library and Information Science Abstracts (LISA), and a sample of peer-reviewed scholarly journals in the field of Library Science. The data include journals that are open access, which was first defined by the Budapest Open Access Initiative: By ‘open access’ to [scholarly] literature, we mean its free availability on the public internet, permitting any users to read, download, copy, distribute, print, search, or link to the full texts of these articles, crawl them for indexing, pass them as data to software, or use them for any other lawful purpose, without financial, legal, or technical barriers other than those inseparable from gaining access to the internet itself. Starting with a batch of 377 journals, we focused our dataset to include journals that met the following criteria: 1) peer-reviewed, 2) written in English or abstracted in English, 3) actively published at the time of...

    Data Collection: In the spring of 2023, researchers gathered 377 scholarly journals whose content covered the work of librarians, archivists, and affiliated information professionals. This data encompassed 221 journals from the ProQuest database Library and Information Science Abstracts (LISA), widely regarded as an authoritative database in the field of librarianship. From the Directory of Open Access Journals, we included 144 LIS journals. We also included 12 other journals not indexed in DOAJ or LISA, based on the researchers' knowledge of existing OA library journals. The data is separated into several different sets representing the different indices and journals we searched. The first set includes journals from the database LISA. The following fields are in this dataset:

    Journal: title of the journal

    Publisher: title of the publishing company

    Open Data Policy: lists whether an open data policy exists and what the policy is

    Country of publication: country where the journal is publ...


  11. Data for: Integrating open education practices with data analysis of open...

    • search.dataone.org
    • data.niaid.nih.gov
    Updated Jul 27, 2024
    Cite
    Marja Bakermans (2024). Data for: Integrating open education practices with data analysis of open science in an undergraduate course [Dataset]. http://doi.org/10.5061/dryad.37pvmcvst
    Explore at:
    Dataset updated
    Jul 27, 2024
    Dataset provided by
    Dryad Digital Repository
    Authors
    Marja Bakermans
    Description

    The open science movement produces vast quantities of openly published data connected to journal articles, creating an enormous resource for educators to engage students in current topics and analyses. However, educators face challenges using these materials to meet course objectives. I present a case study using open science (published articles and their corresponding datasets) and open educational practices in a capstone course. While engaging in current topics of conservation, students trace connections in the research process, learn statistical analyses, and recreate analyses using the programming language R. I assessed the presence of best practices in open articles and datasets, examined student selection in the open grading policy, surveyed students on their perceived learning gains, and conducted a thematic analysis on student reflections. First, articles and datasets met just over half of the assessed fairness practices, but this increased with the publication date. There was a...

    Article and dataset fairness: To assess the utility of open articles and their datasets as an educational tool in an undergraduate academic setting, I measured the congruence of each pair to a set of best practices and guiding principles. I assessed ten guiding principles and best practices (Table 1), where each category was scored ‘1’ or ‘0’ based on whether it met that criteria, with a total possible score of ten.

    Open grading policies: Students were allowed to specify the percentage weight for each assessment category in the course, including 1) six coding exercises (Exercises), 2) one lead exercise (Lead Exercise), 3) fourteen annotation assignments of readings (Annotations), 4) one final project (Final Project), 5) five discussion board posts and a statement of learning reflection (Discussion), and 6) attendance and participation (Participation). I examined if assessment categories (independent variable) were weighted (dependent variable) differently by students using an analysis of ...

    Author: Marja H Bakermans
    Affiliation: Worcester Polytechnic Institute, 100 Institute Rd, Worcester, MA 01609 USA
    ORCID: https://orcid.org/0000-0002-4879-7771
    Institutional IRB approval: IRB-24–0314

    Data and file overview

    The full dataset file called OEPandOSdata (.xlsx extension) contains 8 files. Below are descriptions of the name and contents of each file. NA = not applicable or no data available

    1. BestPracticesData.csv
      • Description: Data to assess the adherence of articles and datasets to open science best practices.
      • Column headers and descriptions:
        • Article: articles used in the study, numbered randomly
        • F1: Findable, Data are assigned a unique and persistent doi
        • F2: Findable, Metadata includes an identifier of data
        • F3: Findable, Data are registered in a searchable database
        • A1: ...
  12. Dataset for: Research data management in academic institutions: a scoping...

    • zenodo.org
    • explore.openaire.eu
    • +1more
    csv, pdf
    Updated Aug 3, 2024
    Cite
    L Perrier; E Blondal; A. P Ayala; D Dearborn; T Kenny; D Lightfoot; R Reka; M Thuna; L Trimble; H MacDonald (2024). Dataset for: Research data management in academic institutions: a scoping review [Dataset]. http://doi.org/10.5281/zenodo.557043
    Explore at:
    Available download formats: csv, pdf
    Dataset updated
    Aug 3, 2024
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    L Perrier; E Blondal; A. P Ayala; D Dearborn; T Kenny; D Lightfoot; R Reka; M Thuna; L Trimble; H MacDonald
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Overview

    This dataset contains the raw data for the manuscript:
    Perrier L, Blondal E, Ayala AP, Dearborn D, Kenny T, Lightfoot D, Reka R, Thuna M, Trimble L, MacDonald H. Research data management in academic institutions: A scoping review. PLOS One. 2017 May 23;12(5):e0178261. doi: 10.1371/journal.pone.0178261.

    Full-text available at: http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0178261

    Data and Documentation Files

    Five files make up the dataset:

    1. Data Dictionary: RDMScopingReview_DataDictionary.pdf
    2. Data Abstraction Sheet: RDMScopingReview_StudyCharacteristics.csv
    3. Data Abstraction Sheet: RDMScopingReview_Setting.csv
    4. Data Abstraction Sheet: RDMScopingReview_DataCollectionTools.csv
    5. Data Abstraction Sheet: RDMScopingReview_Outcomes.csv

    Contact: Laure Perrier: orcid.org/0000-0001-9941-7129

  13. Relative Citation Ratio (RCR): A New Metric That Uses Citation Rates to...

    • plos.figshare.com
    • figshare.com
    tiff
    Updated May 30, 2023
    Cite
    B. Ian Hutchins; Xin Yuan; James M. Anderson; George M. Santangelo (2023). Relative Citation Ratio (RCR): A New Metric That Uses Citation Rates to Measure Influence at the Article Level [Dataset]. http://doi.org/10.1371/journal.pbio.1002541
    Explore at:
    Available download formats: tiff
    Dataset updated
    May 30, 2023
    Dataset provided by
    PLOS (http://plos.org/)
    Authors
    B. Ian Hutchins; Xin Yuan; James M. Anderson; George M. Santangelo
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Despite their recognized limitations, bibliometric assessments of scientific productivity have been widely adopted. We describe here an improved method to quantify the influence of a research article by making novel use of its co-citation network to field-normalize the number of citations it has received. Article citation rates are divided by an expected citation rate that is derived from performance of articles in the same field and benchmarked to a peer comparison group. The resulting Relative Citation Ratio is article level and field independent and provides an alternative to the invalid practice of using journal impact factors to identify influential papers. To illustrate one application of our method, we analyzed 88,835 articles published between 2003 and 2010 and found that the National Institutes of Health awardees who authored those papers occupy relatively stable positions of influence across all disciplines. We demonstrate that the values generated by this method strongly correlate with the opinions of subject matter experts in biomedical research and suggest that the same approach should be generally applicable to articles published in all areas of science. A beta version of iCite, our web tool for calculating Relative Citation Ratios of articles listed in PubMed, is available at https://icite.od.nih.gov.
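
    As a rough illustration of the field-normalisation idea described above (this is not the authors' full method, which derives the expected rate from an article's co-citation network and benchmarks it against a peer comparison group), the ratio itself can be sketched as:

        def relative_citation_ratio(article_citations_per_year, expected_field_citations_per_year):
            """Illustrative sketch only: an article's citation rate divided by the
            expected citation rate for its field, as described for the RCR."""
            if expected_field_citations_per_year <= 0:
                raise ValueError("expected citation rate must be positive")
            return article_citations_per_year / expected_field_citations_per_year

        # Hypothetical numbers: an article cited 12 times per year in a field whose
        # benchmarked expectation is 8 citations per year has an RCR of 1.5.
        print(relative_citation_ratio(12, 8))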

  14. SciRev

    • neuinfo.org
    • dknet.org
    • +1more
    Updated Jan 29, 2022
    Cite
    (2022). SciRev [Dataset]. http://identifiers.org/RRID:SCR_014016/resolver
    Explore at:
    Dataset updated
    Jan 29, 2022
    Description

    A database where researchers and editors can read and post reviews and ratings of scientific journals and their publishing processes. Efficient journals get credit for their efforts to improve their review process and the way they handle manuscripts, while less efficient journals are encouraged to organize theirs better. Researchers can search for a journal with a speedy review procedure and have their papers published sooner. Editors get the opportunity to compare their journal's performance with that of others and to provide information about their journal on the SciRev website.

  15. American Journal of Clinical Dermatology Acceptance Rate - ResearchHelpDesk

    • researchhelpdesk.org
    Updated Feb 15, 2022
    + more versions
    Cite
    Research Help Desk (2022). American Journal of Clinical Dermatology Acceptance Rate - ResearchHelpDesk [Dataset]. https://www.researchhelpdesk.org/journal/acceptance-rate/269/american-journal-of-clinical-dermatology
    Explore at:
    Dataset updated
    Feb 15, 2022
    Dataset authored and provided by
    Research Help Desk
    Description

    American Journal of Clinical Dermatology Acceptance Rate - ResearchHelpDesk - The American Journal of Clinical Dermatology promotes evidence-based therapy and effective patient management within the discipline of dermatology by publishing critical and comprehensive review articles and clinically focussed original research articles covering all aspects of the management of dermatological conditions. The American Journal of Clinical Dermatology offers a range of additional enhanced features designed to increase the visibility, readership and educational value of the journal’s content. Each article is accompanied by a Key Points summary, giving a time-efficient overview of the content to a wide readership. Articles may be accompanied by plain language summaries to assist readers in understanding important medical advances. The journal also provides the option to include various other types of enhanced features including slide sets, videos and animations. All enhanced features are peer reviewed to the same high standard as the article itself. Peer review is conducted using Editorial Manager, supported by a database of international experts. This database is shared with other Adis journals.

    Abstract & indexing: Science Citation Index Expanded (SciSearch), Journal Citation Reports/Science Edition, Medline, SCOPUS, Google Scholar, Current Contents/Clinical Medicine, EBSCO Academic Search, EBSCO Advanced Placement Source, EBSCO CINAHL, EBSCO Discovery Service, EBSCO STM Source, EBSCO TOC Premier, Expanded Academic, Institute of Scientific and Technical Information of China, Japanese Science and Technology Agency (JST), Naver, OCLC WorldCat Discovery Service, Pathway Studio, ProQuest Central, ProQuest Health & Medical Collection, ProQuest Health Research Premium Collection, ProQuest Medical Database, ProQuest-ExLibris Primo, ProQuest-ExLibris Summon, Semantic Scholar

  16. Data from: The Open Review–Based (ORB) dataset: Towards Automatic Assessment...

    • data.niaid.nih.gov
    Updated Sep 2, 2024
    + more versions
    Cite
    Lamine Bougueroua (2024). The Open Review–Based (ORB) dataset: Towards Automatic Assessment of Scientific Papers and Experiment Proposals in High–Energy Physics [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_8053076
    Explore at:
    Dataset updated
    Sep 2, 2024
    Dataset provided by
    Pierre Jouvelot
    Federico Ravotti
    Jaroslaw Szumega
    Blerina Gkotse
    Lamine Bougueroua
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    We introduce the new comprehensive Open Review–Based dataset (ORB); it includes a curated list of more than 62,000 scientific papers, publications and submitted preprints with their more than 157,000 reviews and final decisions. We gather this information from three peer-reviewed sources: the OpenReview.net, SciPost.org and PeerJ.com websites.

  17. scientific_papers

    • huggingface.co
    • tensorflow.org
    Updated Feb 21, 2021
    + more versions
    Cite
    Arman Cohan (2021). scientific_papers [Dataset]. https://huggingface.co/datasets/armanc/scientific_papers
    Explore at:
    Dataset updated
    Feb 21, 2021
    Authors
    Arman Cohan
    License

    https://choosealicense.com/licenses/unknown/

    Description

    The scientific_papers dataset contains two sets of long and structured documents, obtained from the ArXiv and PubMed OpenAccess repositories.

    Both the "arxiv" and "pubmed" configurations have the following features (a loading sketch follows this list):
    • article: the body of the document, paragraphs separated by "/n".
    • abstract: the abstract of the document, paragraphs separated by "/n".
    • section_names: titles of sections, separated by "/n".

  18. World’s Top 2% of Scientists list by Stanford University: An Analysis of its...

    • data.mendeley.com
    Updated Nov 17, 2023
    Cite
    JOHN Philip (2023). World’s Top 2% of Scientists list by Stanford University: An Analysis of its Robustness [Dataset]. http://doi.org/10.17632/td6tdp4m6t.1
    Explore at:
    Dataset updated
    Nov 17, 2023
    Authors
    JOHN Philip
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    John Ioannidis and co-authors [1] created a publicly available database of the top-cited scientists in the world. This database, intended to address the misuse of citation metrics, has generated a lot of interest among the scientific community, institutions, and media. Many institutions have used it as a yardstick to assess the quality of researchers, while others view the list with skepticism, citing problems with the methodology. Two separate databases are created, based on career-long and single recent-year impact. The database is built using Scopus data from Elsevier [1-3]. The scientists included are classified into 22 scientific fields and 174 sub-fields. The parameters considered in this analysis are total citations from 1996 to 2022 (nc9622), the h-index in 2022 (h22), the c-score, and the world rank based on the c-score (Rank ns). Citations without self-citations are considered in all cases (indicated as ns). In the single-year case, citations during 2022 (nc2222) are considered instead of nc9622.

    To evaluate the robustness of the c-score-based ranking, I carried out a detailed analysis of the metric parameters of the last 25 years (1998-2022) of Nobel laureates in physics, chemistry, and medicine, and compared them with the top 100 rank holders in the list. The latest career-long and single-year databases (2022) were used for this analysis. The details of the analysis are presented below. Although the article says the selection is based on the top 100,000 scientists by c-score (with and without self-citations), or a percentile rank of 2% or above in the sub-field, the actual career-based ranking list has 204,644 names [1] and the single-year database contains 210,199 names, so the published lists cover roughly the top 4% of scientists. In the career-based rank list, for the person with the lowest rank (4,809,825), the nc9622, h22, and c-score were 41, 3, and 1.3632, respectively, whereas for the person ranked No. 1, they were 345,061, 264, and 5.5927. Three people on the list had fewer than 100 citations during 1996-2022, 1,155 people had an h22 below 10, and 6 people had a c-score below 2.
    In the single-year rank list, for the person with the lowest rank (6,547,764), the nc2222, h22, and c-score were 1, 1, and 0.6, respectively, whereas for the person ranked No. 1, the nc9622, h22, and c-score were 34,582, 68, and 5.3368. In that list, 4,463 people had fewer than 100 citations in 2022, 71,512 people had an h22 below 10, and 313 people had a c-score below 2. The entry of many authors with a single-digit h-index and a very meager total number of citations indicates serious shortcomings of the c-score-based ranking methodology.

  19. Document level disaggregated data about evaluation and publication delay in...

    • datos.cchs.csic.es
    csv, txt
    Updated Apr 16, 2025
    + more versions
    Cite
    Agencia Estatal Consejo Superior de Investigaciones Científicas (CSIC) (2025). Document level disaggregated data about evaluation and publication delay in Ibero-American scientific journals (2018-2020) - Datos abiertos CCHS [Dataset]. http://doi.org/10.20350/digitalCSIC/14628
    Explore at:
    Available download formats: csv, txt
    Dataset updated
    Apr 16, 2025
    Dataset provided by
    Spanish National Research Council (http://www.csic.es/)
    Authors
    Agencia Estatal Consejo Superior de Investigaciones Científicas (CSIC)
    License

    Attribution-NonCommercial-ShareAlike 4.0 (CC BY-NC-SA 4.0): https://creativecommons.org/licenses/by-nc-sa/4.0/
    License information was derived automatically

    Description

    Document-level disaggregated data on the review, acceptance and publication dates of a sample of 21,890 articles from 326 Ibero-American scientific journals, from all subject areas and countries included in the Latindex Catalogue 2.0, published between 2018 and 2020. The variables included are: document identifier; identifier of the journal section in which the document was included; literal of the journal section in which the document was included; source data; year of publication of the article; reception date of the article; acceptance date of the article; publication date of the article; days between reception date and acceptance date; days between acceptance date and publication date; days between reception date and publication date; identifier of the country/region of the journal; literal of the country/region of the journal; subject area identifier; subject area literal; journal periodicity identifier; journal periodicity; journal identifier; ISSN; journal title. Description: document-level disaggregated dataset about evaluation and publication delay in Ibero-American scientific journals (2018-2020). This dataset has a Creative Commons BY-NC-SA 4.0 licence.
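
    Because the dataset ships both the raw dates and the derived delay variables, the delays can be recomputed (or validated) directly from the date columns. A minimal pandas sketch follows, with the caveat that the file name, delimiter and column names below are assumptions for illustration and must be checked against the actual CSV headers:

        import pandas as pd

        # Assumed file name and column names; adjust sep= and the names to the real CSV.
        articles = pd.read_csv(
            "ibero_american_articles_2018_2020.csv",
            parse_dates=["reception_date", "acceptance_date", "publication_date"],
        )

        articles["days_reception_to_acceptance"] = (
            articles["acceptance_date"] - articles["reception_date"]
        ).dt.days
        articles["days_acceptance_to_publication"] = (
            articles["publication_date"] - articles["acceptance_date"]
        ).dt.days

        print(articles[["days_reception_to_acceptance",
                        "days_acceptance_to_publication"]].describe())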

  20. Data from: JournalReviewer

    • neuinfo.org
    Updated Aug 4, 2015
    Cite
    (2015). JournalReviewer [Dataset]. http://identifiers.org/RRID:SCR_014014
    Explore at:
    Dataset updated
    Aug 4, 2015
    Description

    A database that allows researchers to view and comment on the logistics of scientific journals' review processes. JournalReviewer collects and aggregates feedback from users who have submitted manuscripts to journals in order to provide information for others considering journal submissions. Statistics such as turnaround rate, review length and quality, journal recommendations, and desk-reject plausibility are compiled from commenters' ratings.
