4 datasets found
  1. Scimago Journal Rankings

    • scimagojr.com
    • vnufulimi.com
    • +9 more
    csv
    Updated Jun 26, 2017
    Cite
    Scimago Lab (2017). Scimago Journal Rankings [Dataset]. https://www.scimagojr.com/journalrank.php
    Available download formats: csv
    Dataset updated
    Jun 26, 2017
    Dataset authored and provided by
    Scimago Lab
    Description

    Academic journal indicators developed from the information contained in the Scopus database (Elsevier B.V.). These indicators can be used to assess and analyze scientific domains.

  2. Top 100-Ranked Clinical Journals' Preprint Policies as of April 23, 2020

    • data.niaid.nih.gov
    • zenodo.org
    • +1 more
    zip
    Updated Sep 6, 2020
    Cite
    Dorothy Massey; Joshua Wallach; Joseph Ross; Michelle Opare; Harlan Krumholz (2020). Top 100-Ranked Clinical Journals' Preprint Policies as of April 23, 2020 [Dataset]. http://doi.org/10.5061/dryad.jdfn2z38f
    Available download formats: zip
    Dataset updated
    Sep 6, 2020
    Dataset provided by
    Yale School of Public Health
    Yale New Haven Hospital
    Yale University
    Authors
    Dorothy Massey; Joshua Wallach; Joseph Ross; Michelle Opare; Harlan Krumholz
    License

    https://spdx.org/licenses/CC0-1.0.html

    Description

    Objective: To determine the top 100-ranked (by impact factor) clinical journals' policies toward publishing research previously published on preprint servers (preprints).

    Design: Cross-sectional. Main outcome measures: editorial guidelines toward preprints; journal rank by impact factor.

    Results: 86 (86%) of the journals examined will consider papers previously posted as preprints, 13 (13%) decide on a case-by-case basis, and 1 (1%) does not allow preprints.

    Conclusions: We found wide acceptance of publishing preprints in the clinical research community, although researchers may still face uncertainty that their preprints will be accepted by all of their target journals.

    Methods

    We examined the journal policies of the 100 top-ranked clinical journals, using the 2018 impact factors reported by InCites Journal Citation Reports (JCR). First, we retrieved all journals with an impact factor greater than 5, then manually screened them by title and category to identify the first 100 clinical journals, including only those that publish original research. Next, we checked each journal's editorial policy on preprints, examining, in order, the journal website, the publisher website, the Transpose database, and the first 10 pages of a Google search for the journal name and the term "preprint." We classified each journal's policy, as shown in this dataset, as allowing preprints, deciding on a case-by-case basis, or not allowing any preprints. We collected data on April 23, 2020.

    (Full methods can also be found in the previously published paper.)
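The three-way classification described in this dataset (allow / case-by-case / disallow) can be tallied with a short sketch. The counts come from the Results section above; the category labels and the `policy_shares` helper are illustrative, not part of the dataset itself.

```python
from collections import Counter

# Policy counts reported in the Results section above (n = 100 journals).
policies = ["allow"] * 86 + ["case-by-case"] * 13 + ["disallow"] * 1

def policy_shares(labels):
    """Return each policy's share of the journal sample as a percentage."""
    counts = Counter(labels)
    total = len(labels)
    return {policy: 100 * n / total for policy, n in counts.items()}

shares = policy_shares(policies)
print(shares)  # {'allow': 86.0, 'case-by-case': 13.0, 'disallow': 1.0}
```

Because the sample size is exactly 100, each percentage equals its raw count, which matches the figures reported in the Results.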

  3. Figures for: Global Scientific Production, International Cooperation, and Knowledge Evolution of Public Administration

    • search.dataone.org
    • dataverse.harvard.edu
    Updated Nov 8, 2023
    Cite
    Lyu, Penghui; Zhang, Mingze; Liu, Chuanjun; Ngai, Eric W.T. (2023). Figures for: Global Scientific Production, International Cooperation, and Knowledge Evolution of Public Administration [Dataset]. http://doi.org/10.7910/DVN/WLPPRE
    Dataset updated
    Nov 8, 2023
    Dataset provided by
    Harvard Dataverse
    Authors
    Lyu, Penghui; Zhang, Mingze; Liu, Chuanjun; Ngai, Eric W.T.
    Description

    Public administration is a discipline with considerable history and a diverse, interdisciplinary field within social science. To analyze its evolution, identify present research foci, and predict future development trends, this study applied scientometric visualization technology to over 72,000 scientific articles from the 1920s to the 2020s. The SSCI and JCR databases were used to gather the discipline's scientific data and the journals' impact factors. Paper citations, cited journals, journal co-citations, author co-citations, authoritative papers, top countries, productive institutes, average references, and research collaboration trends were then analyzed on the basis of the published literature. The study identified the most productive journals, countries, and institutes in the discipline, characterized its current research foci, and predicted future development trends. The resulting account of scientific production, international cooperation, and knowledge evolution offers a clear knowledge map of the public administration discipline.

  4. Gosselin_SCIREP_Data

    • figshare.com
    xml
    Updated Jan 25, 2021
    Cite
    Romain Gosselin (2021). Gosselin_SCIREP_Data [Dataset]. http://doi.org/10.6084/m9.figshare.13385621.v2
    Available download formats: xml
    Dataset updated
    Jan 25, 2021
    Dataset provided by
    figshare (http://figshare.com/)
    Authors
    Romain Gosselin
    License

    CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/)
    License information was derived automatically

    Description

    A mixed sampling methodology was implemented (Figure 1) to collect journals and articles. First, a selection filter was applied within the Institute for Scientific Information (ISI) Journal Citation Reports (https://jcr.clarivate.com) database to generate a list of 504 life science journals. Then, exclusion criteria were applied to the journal list and 245 periodicals were removed. Filters and exclusion criteria are given in Table 1. Using a pseudo-random sequence of 20 numbers between 1 and 259 generated with GraphPad QuickCalc (https://www.graphpad.com/quickcalcs/randMenu), a final shortlist of 20 journals was selected from the 259 preselected journals, ordered by decreasing 2018 Impact Factor (the latest available impact factor at the time this study was designed). Four additional journals were then excluded, either because they were eventually found to be too clinical or because the author's institution had no online access, leading to a final list of 16 periodicals (Table 3). Clinical journals were excluded even though they may include publications with some preclinical experiments; this was justified to prevent the bias created both by the presumed small proportion of such articles in clinical periodicals, which would have required a larger sample, and by the supposed compliance of these studies with clinical guidelines, whose standards may differ 29,30. Fifteen articles per journal were collected by sampling the online contents of each journal, starting from the last issue released in 2019 and browsing backward. This time window was selected to avoid the abundant literature on Coronavirus disease 2019 (Covid-19) published since January 2020, which might show unusual statistical standards. Article inclusion and exclusion criteria are presented in Table 2. Studies using human data were acceptable when they used ex-vivo/in-vitro approaches to extract tissues, cells, or samples. From this intermediate list of 240 articles, 17 were excluded during the analysis due to previously unnoticed violations of the inclusion criteria or congruity with the exclusion criteria, resulting in a final sample of 223 articles.

    Assessment of reporting

    Each article was explored, and three types of statistical attributes were quantified (Table 4). Indicators of the transparency of study protocols were binary items coded as 0 (all needed information present in the text) or 1 (information absent for at least one figure or table) and were aggregated as proportions of articles with an insufficiency (non-disclosure) for the given item. The indicators were chosen as the minimum set of information a reader needs to replicate the statistical protocol: a precise sample size (experimental units), a well-identified test, the software used, and no contradiction. Contradictory information is defined as a mismatch between pieces of information provided in different parts of the manuscript that refer to the same object, such as the disclosure of dissimilar statistical tests (in the methods and figure legends) for the analysis in one figure, or of multiple sample sizes for a single set of data. The article structure was assessed using quantitative items, specified as total counts of given items, plus one binary outcome (presence of a statistical paragraph). Qualitative items represented the article content and were summarised as an inventory of information of interest. In the sampled articles, supplemental methods and information were considered full-fledged methodological information, but supplementary figures and tables presenting results were not eligible for the quantification of statistical insufficiencies, even when they were used to report location tests.
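The journal-selection step above (20 pseudo-random picks from the 259 preselected journals) was performed with GraphPad QuickCalc; a minimal sketch of the same draw using Python's standard library is shown below. The journal names and the seed value are placeholders, not the study's actual inputs.

```python
import random

# Placeholder names for the 259 preselected journals, ordered by
# decreasing 2018 Impact Factor (journal_001 = highest impact factor).
preselected = [f"journal_{rank:03d}" for rank in range(1, 260)]

# The study drew its 20 pseudo-random numbers with GraphPad QuickCalc;
# a seeded PRNG plays the same role here, making the draw reproducible.
# The seed value is arbitrary.
rng = random.Random(2019)
shortlist = rng.sample(preselected, k=20)  # sampling without replacement
```

Sampling without replacement guarantees 20 distinct journals, matching the study's shortlist before the four post-hoc exclusions that reduced it to 16 periodicals.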

