51 datasets found
  1. Data from: Journal Ranking Dataset

    • kaggle.com
    Updated Aug 15, 2023
    Cite
    Abir (2023). Journal Ranking Dataset [Dataset]. https://www.kaggle.com/datasets/xabirhasan/journal-ranking-dataset
    Explore at:
    Croissant (a format for machine-learning datasets; learn more at mlcommons.org/croissant)
    Dataset updated
    Aug 15, 2023
    Dataset provided by
    Kaggle
    Authors
    Abir
    License

    https://creativecommons.org/publicdomain/zero/1.0/

    Description

    Journals & Ranking

    According to Wikipedia, an academic or research journal is a periodical publication in which research articles relating to a particular academic discipline are published. Currently, more than 25,000 peer-reviewed journals are indexed in citation databases such as Scopus and Web of Science. Journals in these indexes are ranked on the basis of various metrics, such as CiteScore and H-index, which are calculated from the journal's yearly citation data. Considerable effort goes into designing metrics that reflect a journal's quality.

    Journal Ranking Dataset

    This is a comprehensive dataset on academic journals, covering their metadata as well as citation, metric, and ranking information. Detailed data on each journal's subject area is also included. The dataset is collected from the following indexing databases:

    • Scimago Journal Ranking
    • Scopus
    • Web of Science Master Journal List

    The data was collected by scraping and then cleaned; details can be found HERE.

    Key Features

    • Rank: Overall rank of journal (derived from sorted SJR index).
    • Title: Name or title of journal.
    • OA: Open Access or not.
    • Country: Country of origin.
    • SJR-index: A citation index calculated by Scimago.
    • CiteScore: A citation index calculated by Scopus.
    • H-index: Hirsch index, the largest number h such that at least h articles in that journal were cited at least h times each.
    • Best Quartile: Top Q-index or quartile a journal has in any subject area.
    • Best Categories: Subject areas with top quartile.
    • Best Subject Area: Highest ranking subject area.
    • Best Subject Rank: Rank of the highest ranking subject area.
    • Total Docs.: Total number of documents of the journal.
    • Total Docs. 3y: Total number of documents in the past 3 years.
    • Total Refs.: Total number of references of the journal.
    • Total Cites 3y: Total number of citations in the past 3 years.
    • Citable Docs. 3y: Total number of citable documents in the past 3 years.
    • Cites/Doc. 2y: Total number of citations divided by the total number of documents in the past 2 years.
    • Refs./Doc.: Total number of references divided by the total number of documents.
    • Publisher: Name of the publisher company of the journal.
    • Core Collection: Web of Science core collection name.
    • Coverage: Starting year of coverage.
    • Active: Active or inactive.
    • In-Press: Articles in press or not.
    • ISO Language Code: Three-letter ISO 639 code for language.
    • ASJC Codes: All Science Journal Classification codes for the journal.

    The rest of the features provide further details on the journal's subject area or category:

    • Life Sciences, Social Sciences, Physical Sciences, Health Sciences: top-level subject areas.
    • 1000 General, 1100 Agricultural and Biological Sciences, 1200 Arts and Humanities, 1300 Biochemistry, Genetics and Molecular Biology, 1400 Business, Management and Accounting, 1500 Chemical Engineering, 1600 Chemistry, 1700 Computer Science, 1800 Decision Sciences, 1900 Earth and Planetary Sciences, 2000 Economics, Econometrics and Finance, 2100 Energy, 2200 Engineering, 2300 Environmental Science, 2400 Immunology and Microbiology, 2500 Materials Science, 2600 Mathematics, 2700 Medicine, 2800 Neuroscience, 2900 Nursing, 3000 Pharmacology, Toxicology and Pharmaceutics, 3100 Physics and Astronomy, 3200 Psychology, 3300 Social Sciences, 3400 Veterinary, 3500 Dentistry, 3600 Health Professions: ASJC main categories.
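    Since Rank is described as derived from the sorted SJR index, it can be re-derived directly from the table. A minimal pandas sketch, assuming the Kaggle CSV has been saved locally as journal_ranking.csv and uses the column names listed above (both assumptions):

```python
import pandas as pd

# Hypothetical local copy of the Kaggle CSV; the filename is an assumption.
df = pd.read_csv("journal_ranking.csv").dropna(subset=["SJR-index"])

# Re-derive the overall rank by sorting on SJR-index, descending:
# the journal with the highest SJR index gets rank 1.
df["Derived Rank"] = df["SJR-index"].rank(method="first", ascending=False).astype(int)

# Sanity-check the derivation against the published Rank column.
print((df["Derived Rank"] == df["Rank"]).mean())
```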

  2. Top 100-Ranked Clinical Journals' Preprint Policies as of April 23, 2020

    • zenodo.org
    • data.niaid.nih.gov
    • +1more
    bin
    Updated Jun 3, 2022
    Cite
    Dorothy Massey; Joshua Wallach; Joseph Ross; Michelle Opare; Harlan Krumholz (2022). Top 100-Ranked Clinical Journals' Preprint Policies as of April 23, 2020 [Dataset]. http://doi.org/10.5061/dryad.jdfn2z38f
    Explore at:
    bin (available download formats)
    Dataset updated
    Jun 3, 2022
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Dorothy Massey; Joshua Wallach; Joseph Ross; Michelle Opare; Harlan Krumholz
    License

    CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/)
    License information was derived automatically

    Description

    Objective: To determine the top 100-ranked (by impact factor) clinical journals' policies toward publishing research previously published on preprint servers (preprints).

    Design: Cross-sectional. Main outcome measures: Editorial guidelines toward preprints; journal rank by impact factor.

    Results: 86 (86%) of the journals examined will consider papers previously posted to preprint servers, 13 (13%) determine their decision on a case-by-case basis, and 1 (1%) does not allow preprints.

    Conclusions: We found wide acceptance of publishing preprints in the clinical research community, although researchers may still face uncertainty that their preprints will be accepted by all of their target journals.

  3. Research Data in Core Journals in Biology, Chemistry, Mathematics, and...

    • datasearch.gesis.org
    • openicpsr.org
    Updated Aug 27, 2016
    Cite
    Womack, Ryan (2016). Research Data in Core Journals in Biology, Chemistry, Mathematics, and Physics [2014] [Dataset]. http://doi.org/10.3886/E45086V1
    Explore at:
    Dataset updated
    Aug 27, 2016
    Dataset provided by
    da|ra (Registration agency for social science and economic data)
    Authors
    Womack, Ryan
    Description

    Supplementary data files associated with this study, which takes a stratified random sample of articles published in 2014 from the top 10 journals in the disciplines of biology, chemistry, mathematics, and physics, as ranked by impact factor. Sampled articles were examined for their reporting of original data or reuse of prior data, and were coded for whether the data was publicly shared or otherwise made available to readers. Other characteristics such as the sharing of software code used for analysis and use of data citation and DOIs for data were examined. The study finds that data sharing practices are still relatively rare in these disciplines’ top journals, but that the disciplines have markedly different practices. Biology shares original data at the highest rate, and physics shares at the lowest rate. Overall, the study finds that only 13% of articles with original data published in 2014 make the data available to others.
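    The stratified sampling step is straightforward to reproduce. A sketch using pandas, with a hypothetical articles table (journal and article_id are illustrative column names, not the study's):

```python
import pandas as pd

# Hypothetical pool of 2014 articles, one row per article.
articles = pd.DataFrame({
    "journal": ["J Biol A"] * 50 + ["J Chem B"] * 50,
    "article_id": range(100),
})

# Stratified random sample: a fixed number of articles per journal,
# mirroring sampling within each top-10 journal stratum.
sample = articles.groupby("journal", group_keys=False).sample(n=10, random_state=42)
print(sample["journal"].value_counts())
```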

  4. Data from: Position of Ultrasonography in the scholarly journal network...

    • dataverse.harvard.edu
    Updated May 5, 2020
    Cite
    Sun Huh (2020). Position of Ultrasonography in the scholarly journal network based on bibliometrics and developmental strategies for it to become a top-tier journal [Dataset]. http://doi.org/10.7910/DVN/ZMJN2P
    Explore at:
    Croissant (a format for machine-learning datasets; learn more at mlcommons.org/croissant)
    Dataset updated
    May 5, 2020
    Dataset provided by
    Harvard Dataverse
    Authors
    Sun Huh
    License

    CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/)
    License information was derived automatically

    Description

    Purpose: This study aimed to clarify the present position of Ultrasonography in the scholarly journal network with a variety of bibliometric indicators. Furthermore, developmental strategies for Ultrasonography to become a top-tier journal are suggested. Methods: The following bibliometric indicators were analyzed: number of citable articles, countries of authors, total cites, impact factor, Hirsch index, authors’ countries and source titles of citing articles, and the titles of sources cited by articles in Ultrasonography. Results: The annual number of citable articles was consistently 40 from 2014 to 2019. The number of countries of authors increased to 22 in 2018-2019. The numbers of total cites reached 632 in Web of Science, 595 in Scopus, and 552 in the Crossref metadata in 2019. The estimated 2-year impact factor soared from 2.15 in 2016 to 3.20 in 2019. The Hirsch index was 20 in both Scopus and the Web of Science Core Collection. Authors from 76 countries cited Ultrasonography. The number of source titles of citing articles was 668, and the number of source titles cited by articles in Ultrasonography was 1,246. Conclusion: The above bibliometric results show that Ultrasonography has become a top-tier journal in its field. Ultrasonography furnishes an example of how after changing its language policy to English-only, a local society journal became a highly cited journal in a short period of time. For further development of the journal, adoption of a data-sharing policy is recommended. Furthermore, indexation in MEDLINE should be pursued in the near future.
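    The 2-year impact factor quoted here follows the standard definition: citations received in year Y to items published in years Y-1 and Y-2, divided by the number of citable items published in those two years. A worked illustration (the citation count below is back-calculated for the example, not taken from the dataset):

```python
def impact_factor_2y(cites: int, citable_items: int) -> float:
    """Citations in year Y to items from Y-1 and Y-2,
    divided by citable items published in Y-1 and Y-2."""
    return cites / citable_items

# With 40 citable articles per year (as reported), the two-year window
# holds 80 items; 256 citations to them would yield the 3.20 figure.
print(impact_factor_2y(256, 80))  # 3.2
```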

  5. File S1 - The Scientific Impact of Nations: Journal Placement and Citation...

    • plos.figshare.com
    pdf
    Updated May 31, 2023
    Cite
    Matthew J. Smith; Cody Weinberger; Emilio M. Bruna; Stefano Allesina (2023). File S1 - The Scientific Impact of Nations: Journal Placement and Citation Performance [Dataset]. http://doi.org/10.1371/journal.pone.0109195.s001
    Explore at:
    pdf (available download formats)
    Dataset updated
    May 31, 2023
    Dataset provided by
    PLOS (http://plos.org/)
    Authors
    Matthew J. Smith; Cody Weinberger; Emilio M. Bruna; Stefano Allesina
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Contains supporting information Figures S1–S4. Figure S1. Empirical cumulative distribution functions for all fields. As Figure 1 in main text. The empirical cumulative distribution functions for journal placement (top) and citation performance (bottom) tiers have been plotted for each field. Articles are grouped according to the number of countries included in the affiliation. Figure S2. Effect of country affiliation for all fields. As figure 2 in main text. Effect of country of affiliation on journal placement and citation performance. The color and length of the bars represent the strength of the effect compared to papers originating from the US. The coefficients are obtained fitting either a proportional-odds model (top) or linear model (bottom) to either journal placement (left) or citation performance (right) for each field. Figure S3. Relationship between journal placement and citation performance in all fields. As figure 3 in main text. For each field, countries are positioned according to their ranks in journal placement and citation performance. Under each plot is the Spearman's Rho and associated p value for the relationship. In addition to the rankings for the proportional-odds model (top, shown in the main text for Ecology and Condensed-Matter Physics) the rankings for the linear model (bottom) are also plotted for each field. Figure S4. Proportion of publications and citations through time for all fields and countries. As Figure 4 in main text. Proportion of publications and citations through time. For each year, we computed the proportion of papers published (long-dashed red line) and the proportion of citations received as of May 2013 (solid orange line). We also report the expected proportion of citations (short-dashed black line) and corresponding (95%) confidence intervals (blue shades) obtained by randomization of citation records within Journal: Year combinations. All countries that have published in all eight fields analyzed are included. (PDF)
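    The Spearman's rho reported under each plot measures rank agreement between a country's journal-placement rank and its citation-performance rank. A sketch with scipy and made-up ranks:

```python
from scipy.stats import spearmanr

# Hypothetical per-country ranks in one field (not the paper's data).
placement_rank = [1, 2, 3, 4, 5]
citation_rank = [2, 1, 3, 5, 4]

rho, p_value = spearmanr(placement_rank, citation_rank)
print(f"Spearman's rho = {rho:.2f}, p = {p_value:.3f}")
```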

  6. Data Sharing Policies in Social Sciences Academic Journals: Evolving...

    • dataverse-staging.rdmc.unc.edu
    • datasearch.gesis.org
    Updated Feb 29, 2024
    Cite
    Robert O'Reilly; Joel Herndon (2024). Data Sharing Policies in Social Sciences Academic Journals: Evolving Expectations of Data Sharing as a Form of Scholarly Communication [Dataset]. http://doi.org/10.15139/S3/12157
    Explore at:
    xls(40448), text/x-stata-syntax; charset=us-ascii(11274), text/plain; charset=us-ascii(25382), xls(39936) (available download formats)
    Dataset updated
    Feb 29, 2024
    Dataset provided by
    UNC Dataverse
    Authors
    Robert O'Reilly; Joel Herndon
    License

    https://dataverse-staging.rdmc.unc.edu/api/datasets/:persistentId/versions/2.2/customlicense?persistentId=doi:10.15139/S3/12157

    Time period covered
    2003 - 2015
    Description

    This study consists of data files that code the data availability policies of top-20 academic journals in the fields of Business & Finance, Economics, International Relations, Political Science, and Sociology. Journals ranked as top-20 titles based on 2003-vintage ISI Impact Factor scores were coded on their data policies in 2003 and again in 2015. In addition, journals ranked as top-20 titles based on the most recent ISI Impact Factor scores were likewise coded on their data policies in 2015. The included Stata .do file imports the contents of each of the Excel files, cleans and labels the data, and produces two tables: one comparing the 2003 data policies of 2003-vintage top-20 journals to those journals' policies in 2015, and one comparing the 2003 data policies of 2003-vintage top-20 journals to the data policies of current top-20 journals in 2015.
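    For readers without Stata, the same clean-and-compare step can be approximated in pandas. A sketch with hypothetical file and column names (the actual Excel layouts are documented only inside the dataset):

```python
import pandas as pd

# Hypothetical inputs: one file per coding wave.
policies_2003 = pd.read_excel("top20_2003_vintage.xls")  # columns: journal, policy_2003
policies_2015 = pd.read_excel("top20_2003_in_2015.xls")  # columns: journal, policy_2015

# Merge on journal title and cross-tabulate policy categories,
# mirroring the first comparison table the .do file produces.
merged = policies_2003.merge(policies_2015, on="journal")
print(pd.crosstab(merged["policy_2003"], merged["policy_2015"]))
```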

  7. Best Economics Departments 2018

    • dataandsons.com
    csv, zip
    Updated Aug 22, 2018
    Cite
    Sean Lux (2018). Best Economics Departments 2018 [Dataset]. https://www.dataandsons.com/data-market/social-sciences/best-economics-departments-2018
    Explore at:
    csv, zip (available download formats)
    Dataset updated
    Aug 22, 2018
    Dataset provided by
    Authors
    Sean Lux
    License

    Attribution-ShareAlike 4.0 (CC BY-SA 4.0): https://creativecommons.org/licenses/by-sa/4.0/
    License information was derived automatically

    Time period covered
    Jan 1, 2014 - Dec 31, 2017
    Description

    About this Dataset

    Data & Sons recently completed an analysis of top-tier economics journal publications from 2014 to 2017 and is pleased to announce the world’s best Economics Departments of 2018, based on faculty research productivity. The complete list of departments' publication totals, and all economics publications in the top five economics journals, are included. The methodology and Top 20 ranking are available in Documents.

    Category

    Social Sciences

    Keywords

    best economics department,best economics departments

    Row Count

    1647

    Price

    Free

  8. Data from: Industry payments to physician journal editors

    • zenodo.org
    • datadryad.org
    bin
    Updated Jun 2, 2022
    Cite
    Michael Callaham; Victoria Wong (2022). Industry payments to physician journal editors [Dataset]. http://doi.org/10.7272/q6pk0dbk
    Explore at:
    bin (available download formats)
    Dataset updated
    Jun 2, 2022
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Michael Callaham; Victoria Wong
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Background: Open Payments is a United States federal program mandating reporting of medical industry payments to physicians, thereby increasing transparency of physician conflicts of interest (COI). Study objectives were to assess industry payments to physician-editors and to compare their financial COI rate to that of all physicians within the specialty.

    Methods and Findings: We performed a retrospective analysis of prospectively collected data, reviewing Open Payments from August 1, 2013 to December 31, 2016. We reviewed general payments ("payments… not made in connection with a research agreement") and research funding to "top tier" physician-editors of highly-cited medical journals. We compared payments to physician-editors and to physicians by specialty. In 35 journals, 333 (74.5%) of 447 "top tier" editors met inclusion criteria (US-based physician-editors). Of these, 212 (63.7%) received industry-associated payments in the study period. In an average year, 141 (42.3%) of physician-editors received direct payments (to themselves rather than their institutions; includes general payments and research payments), 66 (19.8%) received direct payments >$5,000 (the threshold designated by the National Institutes of Health as a Significant Financial Interest) and 51 (15.3%) received direct payments >$10,000. Mean annual general payments to physician-editors were $55,157 (median 3,512, standard deviation 561,885, range 10-10,981,153). Median general payments to physician-editors were mostly higher compared to all physicians within their specialty. Mean annual direct research payment to the physician-editor was $14,558 (median 4,000, standard deviation 34,471, range 15-174,440), and mean annual indirect research funding to the physician-editor's institution was $175,282 (median 49,107, standard deviation 479,480, range 0.18-5,000,000). The main study limitation was difficulty in identifying "top tier" physician-editors. Though we aimed to study physician-editors primarily responsible for making manuscript decisions, we were unable to confirm each editor's role.
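    The threshold tallies above are simple to compute once annual direct payments are totaled per editor. A sketch with a hypothetical payments series (the $5,000 cutoff is the NIH Significant Financial Interest threshold mentioned above):

```python
import pandas as pd

# Hypothetical annual direct payments per physician-editor, in USD.
payments = pd.Series([0, 120, 3_512, 4_800, 7_200, 15_000, 250_000])

n = len(payments)
print(f"any direct payment: {(payments > 0).sum() / n:.1%}")
print(f"over $5,000:        {(payments > 5_000).sum() / n:.1%}")
print(f"over $10,000:       {(payments > 10_000).sum() / n:.1%}")
```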

    Conclusions: A substantial minority of physician-editors receive payments from industry within any given year, and most editors received payment of some kind during the four-year study period. There were significant outliers. Given the extent of editors' influences on the medical literature, more robust and accessible editor financial COI declarations are recommended.

  9. Descriptives of 284 publications from medium and top tier biomedical...

    • plos.figshare.com
    xls
    Updated Jun 3, 2023
    Cite
    Gerben ter Riet; Paula Chesley; Alan G. Gross; Lara Siebeling; Patrick Muggensturm; Nadine Heller; Martin Umbehr; Daniela Vollenweider; Tsung Yu; Elie A. Akl; Lizzy Brewster; Olaf M. Dekkers; Ingrid Mühlhauser; Bernd Richter; Sonal Singh; Steven Goodman; Milo A. Puhan (2023). Descriptives of 284 publications from medium and top tier biomedical journals used to count and classify limitations and calculate the hedging scores. [Dataset]. http://doi.org/10.1371/journal.pone.0073623.t001
    Explore at:
    xls (available download formats)
    Dataset updated
    Jun 3, 2023
    Dataset provided by
    PLOS (http://plos.org/)
    Authors
    Gerben ter Riet; Paula Chesley; Alan G. Gross; Lara Siebeling; Patrick Muggensturm; Nadine Heller; Martin Umbehr; Daniela Vollenweider; Tsung Yu; Elie A. Akl; Lizzy Brewster; Olaf M. Dekkers; Ingrid Mühlhauser; Bernd Richter; Sonal Singh; Steven Goodman; Milo A. Puhan
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Numbers are medians and (in brackets) interquartile ranges unless indicated otherwise; RCT = randomized controlled trial. Raw scores indicate the number of hedges in a publication, each weighted by a hedging weight between 1 and 5; the hedging score is calculated by dividing the raw score by the number of words in (the relevant sections of) the publication. A hedging score of 3.0% indicates that for every 100 words there is one expression of uncertainty with a weight of 3 (or three with a hedging weight of 1, or fewer than one but with a hedging weight higher than 3, i.e., expressing more uncertainty).
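    Read literally, the hedging score is the weight-summed hedge count per 100 words. A direct translation of that definition (my reading of the table note, not the authors' code):

```python
def hedging_score(hedge_weights: list[int], n_words: int) -> float:
    """Sum of hedge weights (each 1-5) per 100 words of text."""
    return 100 * sum(hedge_weights) / n_words

# One hedge of weight 3 in a 100-word section gives 3.0%,
# matching the example in the description above.
print(hedging_score([3], 100))  # 3.0
```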

  10. Best Economics PhD Program Rankings

    • dataandsons.com
    csv, zip
    Updated Aug 16, 2018
    Cite
    Sean Lux (2018). Best Economics PhD Program Rankings [Dataset]. https://www.dataandsons.com/categories/social-sciences/best-economics-phd-program-rankings
    Explore at:
    csv, zip (available download formats)
    Dataset updated
    Aug 16, 2018
    Dataset provided by
    Authors
    Sean Lux
    License

    Attribution-ShareAlike 4.0 (CC BY-SA 4.0): https://creativecommons.org/licenses/by-sa/4.0/
    License information was derived automatically

    Time period covered
    Jan 1, 2014 - Dec 31, 2017
    Description

    About this Dataset

    Data & Sons recently completed an analysis of top-tier economics journal publications from 2014 to 2017 and is pleased to announce the world’s top Economics PhD Programs based on alumni research productivity.

    Category

    Social Sciences

    Keywords

    economics phd programs,best phd programs,best economics phd programs

    Row Count

    1491

    Price

    Free

  11. Top Management PhD Programs

    • dataandsons.com
    csv, zip
    Updated Sep 7, 2018
    Cite
    Sean Lux (2018). Top Management PhD Programs [Dataset]. https://www.dataandsons.com/categories/education/top-management-phd-prrograms
    Explore at:
    csv, zip (available download formats)
    Dataset updated
    Sep 7, 2018
    Dataset provided by
    Authors
    Sean Lux
    License

    Attribution-ShareAlike 4.0 (CC BY-SA 4.0): https://creativecommons.org/licenses/by-sa/4.0/
    License information was derived automatically

    Time period covered
    Jan 1, 2014 - Dec 31, 2017
    Description

    About this Dataset

    The number of alumni publications in top-tier management journals provides an indication of the quality of training (and selection) of those programs. We totaled publications in top-tier management journals from 2014 to 2017 published by the alumni of management PhD programs. All articles published in Academy of Management Journal, Academy of Management Review, Administrative Science Quarterly, Journal of Management, and Organization Science were collected and summarized in this dataset.

    Category

    Education

    Keywords

    top management PhD Programs,best management phd programs

    Row Count

    3373

    Price

    Free

  12. Data from: Scimago Institutions Rankings

    • scimagoir.com
    • 0221.com.ar
    • +1more
    csv
    Updated Sep 25, 2009
    Cite
    Scimago Lab (2009). Scimago Institutions Rankings [Dataset]. https://www.scimagoir.com/
    Explore at:
    csv (available download formats)
    Dataset updated
    Sep 25, 2009
    Dataset authored and provided by
    Scimago Lab
    Description

    The SCImago Institutions Rankings (SIR) is a classification of academic and research-related institutions ranked by a composite indicator that combines three different sets of indicators based on research performance, innovation outputs, and societal impact measured by web visibility. It provides a friendly interface that allows the visualization of any customized ranking built from a combination of these three sets of indicators. Additionally, it is possible to compare the trends of individual indicators for up to six institutions. For each large sector it is also possible to obtain distribution charts of the different indicators. For comparative purposes, the value of the composite indicator has been set on a scale of 0 to 100. However, the line graphs and bar graphs always represent ranks (lower is better, so the highest values are the worst).
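    Setting a composite indicator on a 0-100 scale is typically a min-max rescaling. SIR's exact formula is not given here, so the sketch below is illustrative only:

```python
def rescale_0_100(values: list[float]) -> list[float]:
    """Min-max rescale: the lowest value maps to 0, the highest to 100."""
    lo, hi = min(values), max(values)
    return [100 * (v - lo) / (hi - lo) for v in values]

# Three hypothetical raw composite values.
print(rescale_0_100([0.2, 0.5, 0.9]))  # [0.0, 42.857..., 100.0]
```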

  13. Replication data for: Restructuring Research: Communication Costs and the...

    • openicpsr.org
    Updated Sep 1, 2008
    Cite
    Ajay Agrawal; Avi Goldfarb (2008). Replication data for: Restructuring Research: Communication Costs and the Democratization of University Innovation [Dataset]. http://doi.org/10.3886/E113260V1
    Explore at:
    Dataset updated
    Sep 1, 2008
    Dataset provided by
    American Economic Association
    Authors
    Ajay Agrawal; Avi Goldfarb
    Description

    We report evidence that Bitnet adoption facilitated increased research collaboration between US universities. However, not all institutions benefited equally. Using panel data from seven top engineering journals, Bitnet connection records, and institution ranking data, we find that middle-tier universities were the primary beneficiaries; they benefited largely by increasing their collaboration with top-tier schools. Furthermore, we find that the magnitude of this effect is greatest for co-located pairs. Thus, the advent of Bitnet – and likely of subsequent networks – seems to have increased the role of middle-tier universities as producers of new knowledge in the national innovation system. (JEL D85, I23, O31, O33)

  14. Data from: Expert-Driven and Citational Approaches to Assessing Journal...

    • scielo.figshare.com
    jpeg
    Updated Jun 3, 2023
    Cite
    Lorena Guadalupe Barberia; Danilo Praxedes Barboza; Samuel Ralize Godoy (2023). Expert-Driven and Citational Approaches to Assessing Journal Publications of Brazilian Political Scientists [Dataset]. http://doi.org/10.6084/m9.figshare.6007787.v1
    Explore at:
    jpeg (available download formats)
    Dataset updated
    Jun 3, 2023
    Dataset provided by
    SciELO journals
    Authors
    Lorena Guadalupe Barberia; Danilo Praxedes Barboza; Samuel Ralize Godoy
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    In this study, we seek to contribute to discussions on how the quality of academic production in the field of political science should be evaluated, using Brazil as a case study. We contrast the 'expert-driven approach' followed by CAPES, an agency of the Brazilian federal government, with the 'citational' approach, which is based on the ranking of journals by mainstream indices of scientific research impact. With data provided by CAPES from 2010 to 2014, we examine to what extent journals ranked as high quality by CAPES also have high impact indexes in the SCImago Journal Rank index (SJR), the Hirsch index (h-index) calculated by SCImago, the h5-index and h5-median (based on the h-index over a 5-year period, calculated by Google Scholar Metrics), and the SNIP indicator (calculated by the CWTS Journal Indicators, included in the Scopus database). Our findings show that there is a positive but weak correlation between citational criteria and the Qualis evaluation of the same journals. In ordered logistic regressions, we show that a journal's past Qualis scores are the most important factor for explaining its grades in the next evaluation. We show that once a journal's past Qualis score is considered, a journal's citational ranking does not influence its Qualis score, with the exception of the SJR in the 2013-2014 evaluation. Moreover, a journal's Qualis score is not influenced by the country of publication, language, or social science focus, all else equal.
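    Ordered logistic regressions of this kind can be reproduced with statsmodels' OrderedModel. A sketch with hypothetical variables (the CAPES data uses different names and far more journals):

```python
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

# Hypothetical journal-level data: an ordered Qualis grade plus predictors.
df = pd.DataFrame({
    "qualis": pd.Categorical(["B1", "A2", "A1", "A2", "B1", "A1"],
                             categories=["B1", "A2", "A1"], ordered=True),
    "past_qualis": [1, 2, 3, 3, 1, 2],      # grade in the previous evaluation
    "sjr": [0.3, 0.9, 2.1, 1.0, 0.5, 2.4],  # citational ranking indicator
})

# Proportional-odds (ordered logit) model of the current grade.
model = OrderedModel(df["qualis"], df[["past_qualis", "sjr"]], distr="logit")
result = model.fit(method="bfgs", disp=False)
print(result.params)
```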

  15. Data from: Inventory of online public databases and repositories holding...

    • agdatacommons.nal.usda.gov
    • s.cnmilf.com
    • +4more
    txt
    Updated Feb 8, 2024
    Cite
    Erin Antognoli; Jonathan Sears; Cynthia Parr (2024). Inventory of online public databases and repositories holding agricultural data in 2017 [Dataset]. http://doi.org/10.15482/USDA.ADC/1389839
    Explore at:
    txt (available download formats)
    Dataset updated
    Feb 8, 2024
    Dataset provided by
    Ag Data Commons
    Authors
    Erin Antognoli; Jonathan Sears; Cynthia Parr
    License

    CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/)
    License information was derived automatically

    Description

    United States agricultural researchers have many options for making their data available online. This dataset aggregates the primary sources of ag-related data and determines where researchers are likely to deposit their agricultural data. These data serve both as a current landscape analysis and as a baseline for future studies of ag research data.

    Purpose

    As sources of agricultural data become more numerous and disparate, and collaboration and open data become more expected if not required, this research provides a landscape inventory of online sources of open agricultural data. An inventory of current agricultural data sharing options will help assess how the Ag Data Commons, a platform for USDA-funded data cataloging and publication, can best support data-intensive and multi-disciplinary research. It will also help agricultural librarians assist their researchers in data management and publication. The goals of this study were to:

    • establish where agricultural researchers in the United States -- land grant and USDA researchers, primarily ARS, NRCS, USFS and other agencies -- currently publish their data, including general research data repositories, domain-specific databases, and the top journals
    • compare how much data is in institutional vs. domain-specific vs. federal platforms
    • determine which repositories are recommended by top journals that require or recommend the publication of supporting data
    • ascertain where researchers not affiliated with funding or initiatives possessing a designated open data repository can publish data

    Approach

    The National Agricultural Library team focused on Agricultural Research Service (ARS), Natural Resources Conservation Service (NRCS), and United States Forest Service (USFS) style research data, rather than ag economics, statistics, and social sciences data. To find domain-specific, general, institutional, and federal agency repositories and databases that are open to US research submissions and have some amount of ag data, resources including re3data, libguides, and ARS lists were analysed. Primarily environmental or public health databases were not included, but places where ag grantees would publish data were considered.
    Search methods

    We first compiled a list of known domain-specific USDA / ARS datasets and databases that are represented in the Ag Data Commons, including ARS Image Gallery, ARS Nutrition Databases (sub-components), SoyBase, PeanutBase, National Fungus Collection, i5K Workspace @ NAL, and GRIN. We then searched using search engines such as Bing and Google for non-USDA / federal ag databases, using Boolean variations of “agricultural data” / “ag data” / “scientific data” + NOT + USDA (to filter out the federal / USDA results). Most of these results were domain-specific, though some contained a mix of data subjects.

    We then used search engines such as Bing and Google to find top agricultural university repositories, using variations of “agriculture”, “ag data” and “university” to find schools with agriculture programs. Using that list of universities, we searched each university web site to see if the institution had a repository for its unique, independent research data, if not apparent in the initial web browser search. We found both ag-specific university repositories and general university repositories that housed a portion of agricultural data. Ag-specific university repositories are included in the list of domain-specific repositories. Results included Columbia University – International Research Institute for Climate and Society, UC Davis – Cover Crops Database, etc. If a general university repository existed, we determined whether that repository could filter to include only data results after our chosen ag search terms were applied. General university databases that contain ag data included Colorado State University Digital Collections, University of Michigan ICPSR (Inter-university Consortium for Political and Social Research), and University of Minnesota DRUM (Digital Repository of the University of Minnesota). We then split out NCBI (National Center for Biotechnology Information) repositories.

    Next we searched the internet for open general data repositories using a variety of search engines, and repositories containing a mix of data, journals, books, and other types of records were tested to determine whether they could filter for data results after search terms were applied. General subject data repositories include Figshare, Open Science Framework, PANGEA, Protein Data Bank, and Zenodo. Finally, we compared scholarly journal suggestions for data repositories against our list to fill in any missing repositories that might contain agricultural data. Extensive lists of journals in which USDA published in 2012 and 2016 were compiled, combining search results in ARIS, Scopus, and the Forest Service's TreeSearch, plus the USDA web sites Economic Research Service (ERS), National Agricultural Statistics Service (NASS), Natural Resources and Conservation Service (NRCS), Food and Nutrition Service (FNS), Rural Development (RD), and Agricultural Marketing Service (AMS). The top 50 journals' author instructions were consulted to see if they (a) ask or require submitters to provide supplemental data, or (b) require submitters to submit data to open repositories. Data are provided for journals based on the 2012 and 2016 study of where USDA employees publish their research, ranked by number of articles, including the columns 2015/2016 Impact Factor, Author guidelines, Supplemental Data?, Supplemental Data reviewed?, Open Data (Supplemental or in Repository) Required?, and Recommended data repositories, as provided in the online author guidelines for each of the top 50 journals.
    Evaluation

    We ran a series of searches on all resulting general subject databases with the designated search terms. From the results, we noted the total number of datasets in the repository, the type of resource searched (datasets, data, images, components, etc.), the percentage of the total database that each term comprised, any dataset with a search term that comprised at least 1% and 5% of the total collection, and any search term that returned greater than 100 and greater than 500 results. We compared domain-specific databases and repositories based on parent organization, type of institution, and whether data submissions were dependent on conditions such as funding or affiliation of some kind.

    Results

    A summary of the major findings from our data review:

    • Over half of the top 50 ag-related journals from our profile require or encourage open data for their published authors.
    • There are few general repositories that are both large AND contain a significant portion of ag data in their collection. GBIF (Global Biodiversity Information Facility), ICPSR, and ORNL DAAC were among those that had over 500 datasets returned with at least one ag search term and had that result comprise at least 5% of the total collection.
    • Not even one quarter of the domain-specific repositories and datasets reviewed allow open submission by any researcher regardless of funding or affiliation.

    See included README file for descriptions of each individual data file in this dataset. Resources in this dataset:

    • Journals. File Name: Journals.csv
    • Journals - Recommended repositories. File Name: Repos_from_journals.csv
    • TDWG presentation. File Name: TDWG_Presentation.pptx
    • Domain Specific ag data sources. File Name: domain_specific_ag_databases.csv
    • Data Dictionary for Ag Data Repository Inventory. File Name: Ag_Data_Repo_DD.csv
    • General repositories containing ag data. File Name: general_repos_1.csv
    • README and file inventory. File Name: README_InventoryPublicDBandREepAgData.txt
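    The evaluation step above reduces to a share computation per search term. A sketch of the flagging logic as described, with made-up counts for a single repository:

```python
# Hypothetical result counts for one general-subject repository.
total_datasets = 20_000
term_hits = {"agricultural data": 1_250, "ag data": 430, "soil": 95}

for term, hits in term_hits.items():
    share = hits / total_datasets
    flags = []
    if share >= 0.05:
        flags.append(">=5% of collection")
    elif share >= 0.01:
        flags.append(">=1% of collection")
    if hits > 500:
        flags.append(">500 results")
    elif hits > 100:
        flags.append(">100 results")
    print(f"{term}: {share:.2%} {flags}")
```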

  16. Leading newspapers in the U.S. 2017-2019, by circulation

    • statista.com
    Updated Jul 9, 2025
    Cite
    Leading newspapers in the U.S. 2017-2019, by circulation [Dataset]. https://www.statista.com/statistics/184682/us-daily-newspapers-by-circulation/
    Explore at:
    Dataset updated
    Jul 9, 2025
    Dataset authored and provided by
    Statista (http://statista.com/)
    Time period covered
    Sep 2017 - Jan 2019
    Area covered
    United States
    Description

    The circulation figures for daily newspapers in the United States reveal that USA Today distributed the most papers as of January 2019, with a daily circulation of over **** million. Back in September 2017, The Wall Street Journal ranked first, with circulation figures far outperforming The New York Times, Chicago Tribune and New York Post. Although the 2019 data shows that all daily newspapers in the top ten ranking saw a decrease in circulation since 2017, the Chicago Tribune was hit the hardest. Despite making the top ten list in both years, the paper’s daily circulation decreased by almost *** thousand.

    The decline of newspaper circulation

    It is no secret that print is floundering whilst digital media is flourishing. However, falling circulation figures affect newspapers' advertising revenue, as circulation is one of the metrics used to determine and set advertising rates. In 2017, the estimated revenue of newspaper publishers in the United States was almost seven billion dollars lower than in 2010, despite several years of reasonably consistent revenue in terms of subscription and sales.

    Similarly, print newspaper publishing revenue in the U.S. dropped by almost *** million dollars in seven years, whilst revenue for online newspapers showed positive growth.

  17. Management Department Research Productivity

    • dataandsons.com
    csv, zip
    Updated Sep 7, 2018
    Cite
    Sean Lux (2018). Management Department Research Productivity [Dataset]. https://www.dataandsons.com/categories/education/management-department-research-productivity
    Explore at:
    zip, csv (available download formats)
    Dataset updated
    Sep 7, 2018
    Dataset provided by
    Authors
    Sean Lux
    License

    Attribution-ShareAlike 4.0 (CC BY-SA 4.0): https://creativecommons.org/licenses/by-sa/4.0/
    License information was derived automatically

    Time period covered
    Jan 1, 2014 - Dec 31, 2017
    Description

    About this Dataset

    Total publications in top-tier management journals from 2014 to 2017, summarized by management department. Total articles published is the sum of all articles published in Academy of Management Journal, Academy of Management Review, Administrative Science Quarterly, Journal of Management, and Organization Science by department faculty.

    Category

    Education

    Keywords

    management department,management professor,top management professors,top management departments

    Row Count

    3606

    Price

    Free

  18. Data from: Annotated Dataset for Uncertainty Mining: Gold Standard

    • data.niaid.nih.gov
    Updated Nov 13, 2024
    Cite
    Gutehrlé, Nicolas (2024). Annotated Dataset for Uncertainty Mining : Gold Standard [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_14134214
    Explore at:
    Dataset updated
    Nov 13, 2024
    Dataset provided by
    Ningrum, Panggih Kusuma
    Atanassova, Iana
    Gutehrlé, Nicolas
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Description of the dataset

    In order to study the expression of uncertainty in scientific articles, we have put together an interdisciplinary corpus of journals in the fields of Science, Technology and Medicine (STM) and the Humanities and Social Sciences (SHS). The selection of journals in our corpus is based on the Scimago Journal and Country Rank (SJR) classification, which is based on Scopus, the largest academic database available online. We have selected journals covering various disciplines, such as medicine, biochemistry, genetics and molecular biology, computer science, social sciences, environmental sciences, psychology, arts and humanities. For each discipline, we selected the five highest-ranked journals. In addition, we have included the journals PLoS ONE and Nature, both of which are interdisciplinary and highly ranked.

    Based on the corpus of articles from different disciplines described above, we created a set of annotated sentences as follows:

    593 sentences were pre-selected automatically, by studying the occurrences of the lists of uncertainty indices proposed by Bongelli et al. (2019), Chen et al. (2018) and Hyland (1996).

    The remaining sentences were extracted from a subset of articles, consisting of two randomly selected articles per journal. These articles were examined by two human annotators to identify sentences containing uncertainty and to annotate them.

    600 sentences not expressing scientific uncertainty were manually identified and reviewed by two annotators.

    The sentences were annotated by two independent annotators following the annotation guide proposed by Ningrum and Atanassova (2024). The annotators were trained on the basis of an annotation guide and previously annotated sentences in order to guarantee the consistency of the annotations. Each sentence was annotated as expressing or not expressing uncertainty (Uncertainty and No Uncertainty). Sentences expressing uncertainty were then annotated along five dimensions: Reference, Nature, Context, Timeline and Expression. The annotators reached an average agreement score of 0.414 according to Cohen's kappa test, which shows the difficulty of the task of annotating scientific uncertainty. Finally, conflicting annotations were resolved by a third independent annotator.
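    The agreement figure can be reproduced with scikit-learn once the two annotators' labels are aligned sentence by sentence. A sketch with made-up labels:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical per-sentence labels from two annotators
# (1 = expresses uncertainty, 0 = does not).
annotator_1 = [1, 0, 1, 1, 0, 0, 1, 0]
annotator_2 = [1, 0, 0, 1, 0, 1, 1, 0]

print(f"Cohen's kappa: {cohen_kappa_score(annotator_1, annotator_2):.3f}")
```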

    Our final corpus thus consists of a total of 1,840 sentences from 496 articles in 21 English-language journals from 8 different disciplines. The columns of the table are as follows:

    journal: name of the journal from where the article originates

    article_title: title of the article from where the sentence is extracted

    publication_year: year of publication of the article

    sentence_text: text of the sentence expressing or not expressing uncertainty

    uncertainty: 1 if the sentence expresses uncertainty and 0 otherwise;

    ref, nature, context, timeline, expression: annotations of the type of uncertainty according to the annotation framework proposed by Ningrum and Atanassova (2023). The annotations of each dimension in this dataset are in numeric format rather than textual. The mapping between textual and numeric labels is presented in the table below.

    Dimension    1           2             3          4           5
    Reference    Author      Former        Both
    Nature       Epistemic   Aleatory      Both
    Context      Background  Methods       Res&Disc   Conclusion  Others
    Timeline     Past        Present       Future
    Expression   Quantified  Unquantified

    This gold standard has been produced as part of the ANR InSciM (Modelling Uncertainty in Science) project.

    References

    Bongelli, R., Riccioni, I., Burro, R., & Zuczkowski, A. (2019). Writers’ uncertainty in scientific and popular biomedical articles. A comparative analysis of the British Medical Journal and Discover Magazine [Publisher: Public Library of Science]. PLoS ONE, 14 (9). https://doi.org/10.1371/journal.pone.0221933

    Chen, C., Song, M., & Heo, G. E. (2018). A scalable and adaptive method for finding semantically equivalent cue words of uncertainty. Journal of Informetrics, 12 (1), 158–180. https://doi.org/10.1016/j.joi.2017.12.004

    Hyland, K. E. (1996). Talking to the academy forms of hedging in science research articles [Publisher: SAGE Publications Inc.]. Written Communication, 13 (2), 251–281. https://doi.org/10.1177/0741088396013002004

    Ningrum, P. K., & Atanassova, I. (2023). Scientific Uncertainty: An Annotation Framework and Corpus Study in Different Disciplines. 19th International Conference of the International Society for Scientometrics and Informetrics (ISSI 2023). https://doi.org/10.5281/zenodo.8306035

    Ningrum, P. K., & Atanassova, I. (2024). Annotation of scientific uncertainty using linguistic patterns. Scientometrics. https://doi.org/10.1007/s11192-024-05009-z

  19. A meta-analysis of published research into what kinds of university teaching...

    • researchdata.edu.au
    Updated 2011
    Cite
    Trishita Kordyban; Shelley Kinash (2011). A meta-analysis of published research into what kinds of university teaching lead to improvements in learning [Dataset]. http://doi.org/10.4225/57/58AD1F058B51F
    Explore at:
    Dataset updated
    2011
    Dataset provided by
    Bond University
    Authors
    Trishita Kordyban; Shelley Kinash
    Description

    This dataset is the result of a quantitative and qualitative meta-analysis of higher education research published in 18 top-tier journals between 1991-2011. One hundred and thirty-one published papers met the conditions of: peer-reviewed journal articles; research participants are students of higher education; empirical research; articulate a goal of collecting evidence of learning or seeking to confirm that students are learning; learning outcomes as dependent variable. The data is in spreadsheet, SPSS and NVIVO format.

  20. WORLD UNIVERSITY RANKING 2022 - 2023

    • kaggle.com
    Updated Sep 30, 2022
    Cite
    Aman Chauhan (2022). WORLD UNIVERSITY RANKING 2022 - 2023 [Dataset]. https://www.kaggle.com/whenamancodes/world-university-ranking-2022-2023/discussion
    Explore at:
    Croissant (a format for machine-learning datasets; learn more at mlcommons.org/croissant)
    Dataset updated
    Sep 30, 2022
    Dataset provided by
    Kaggle
    Authors
    Aman Chauhan
    Description

    Methodology

    The Center for World University Rankings (CWUR) publishes the only academic ranking of global universities that assesses the quality of education, alumni employment, quality of faculty, and research performance without relying on surveys and university data submissions.

    CWUR uses seven objective and robust indicators, grouped into four areas, to rank the world’s universities:

    • Education: based on the academic success of a university’s alumni, measured by the number of a university's alumni who have won prestigious academic distinctions relative to the university's size (25%)
    • Employability: based on the professional success of a university’s alumni, measured by the number of a university's alumni who have held top positions at major companies relative to the university's size (25%)
    • Faculty: measured by the number of faculty members who have won prestigious academic distinctions (10%)
    • Research:
      i) Research Output: measured by the total number of research papers (10%)
      ii) High-Quality Publications: measured by the number of research papers appearing in top-tier journals (10%)
      iii) Influence: measured by the number of research papers appearing in highly-influential journals (10%)
      iv) Citations: measured by the number of highly-cited research papers (10%)
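    Because the seven indicator weights are published, the composite score is a plain weighted sum, as sketched below. This assumes each indicator has already been normalized to a 0-100 score; CWUR's normalization itself is not described here.

```python
# CWUR indicator weights, as listed above.
WEIGHTS = {
    "education": 0.25,
    "employability": 0.25,
    "faculty": 0.10,
    "research_output": 0.10,
    "high_quality_publications": 0.10,
    "influence": 0.10,
    "citations": 0.10,
}

def cwur_composite(scores: dict[str, float]) -> float:
    """Weighted sum of the seven indicator scores (each assumed 0-100)."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# A university scoring 80 on every indicator gets a composite of 80.
print(cwur_composite({k: 80.0 for k in WEIGHTS}))
```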
