100+ datasets found
  1. October 2023 data-update for "Updated science-wide author databases of...

    • elsevier.digitalcommonsdata.com
    Updated Oct 4, 2023
    + more versions
    Cite
    John P.A. Ioannidis (2023). October 2023 data-update for "Updated science-wide author databases of standardized citation indicators" [Dataset]. http://doi.org/10.17632/btchxktzyw.6
    Explore at:
    Dataset updated
    Oct 4, 2023
    Authors
    John P.A. Ioannidis
    License

    Attribution-NonCommercial 3.0 (CC BY-NC 3.0), https://creativecommons.org/licenses/by-nc/3.0/
    License information was derived automatically

    Description

    Citation metrics are widely used and misused. We have created a publicly available database of top-cited scientists that provides standardized information on citations, h-index, co-authorship adjusted hm-index, citations to papers in different authorship positions and a composite indicator (c-score). Data are shown separately for career-long impact and for single recent year impact. Metrics with and without self-citations and the ratio of citations to citing papers are given. Scientists are classified into 22 scientific fields and 174 sub-fields according to the standard Science-Metrix classification. Field- and subfield-specific percentiles are also provided for all scientists with at least 5 papers. Career-long data are updated to end-of-2022 and single recent year data pertain to citations received during calendar year 2022. The selection is based on the top 100,000 scientists by c-score (with and without self-citations) or a percentile rank of 2% or above in the sub-field. This version (6) is based on the October 1, 2023 snapshot from Scopus, updated to the end of citation year 2022. This work uses Scopus data provided by Elsevier through ICSR Lab (https://www.elsevier.com/icsr/icsrlab). Calculations were performed using all Scopus author profiles as of October 1, 2023. If an author is not on the list, it is simply because the composite indicator value was not high enough to appear on the list. It does not mean that the author does not do good work.

    PLEASE ALSO NOTE THAT THE DATABASE HAS BEEN PUBLISHED IN AN ARCHIVAL FORM AND WILL NOT BE CHANGED. The published version reflects Scopus author profiles at the time of calculation. We thus advise authors to ensure that their Scopus profiles are accurate. REQUESTS FOR CORRECTIONS OF THE SCOPUS DATA (INCLUDING CORRECTIONS IN AFFILIATIONS) SHOULD NOT BE SENT TO US. They should be sent directly to Scopus, preferably by use of the Scopus to ORCID feedback wizard (https://orcid.scopusfeedback.com/) so that the correct data can be used in any future annual updates of the citation indicator databases.

    The c-score focuses on impact (citations) rather than productivity (number of publications), and it also incorporates information on co-authorship and author positions (single, first, last author). If you have additional questions, please read the three associated PLoS Biology papers that explain the development, validation and use of these metrics and databases (https://doi.org/10.1371/journal.pbio.1002501, https://doi.org/10.1371/journal.pbio.3000384 and https://doi.org/10.1371/journal.pbio.3000918).
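
    As a rough illustration of how a composite indicator of this kind can be computed, the sketch below log-transforms six component indicators and rescales each by its maximum across authors before summing. This is a simplified sketch in the spirit of the c-score, with made-up example values; consult the PLoS Biology papers above for the exact definition.

        import math

        # Six component indicators, following the general scheme described above:
        # total citations, h-index, hm-index, and citations to papers as
        # single / single-plus-first / single-plus-first-plus-last author.
        INDICATORS = ["nc", "h", "hm", "ncs", "ncsf", "ncsfl"]

        def c_scores(authors):
            """authors: list of dicts mapping each indicator name to a raw count."""
            maxima = {k: max(a[k] for a in authors) for k in INDICATORS}
            scores = []
            for a in authors:
                # each component: log(1 + x), rescaled by the maximum across authors
                s = sum(math.log(1 + a[k]) / (math.log(1 + maxima[k]) or 1.0)
                        for k in INDICATORS)
                scores.append(s)
            return scores

        # hypothetical example values, for illustration only
        authors = [
            {"nc": 12000, "h": 50, "hm": 30, "ncs": 400, "ncsf": 900, "ncsfl": 2000},
            {"nc": 3000, "h": 25, "hm": 18, "ncs": 100, "ncsf": 250, "ncsfl": 600},
        ]
        print(c_scores(authors))  # higher score = stronger composite citation impact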

    Finally, we alert users that all citation metrics have limitations and their use should be tempered and judicious. For more reading, we refer to the Leiden manifesto: https://www.nature.com/articles/520429a

  2. Scopus

    • neuinfo.org
    • dknet.org
    • +2 more
    Updated Jul 15, 2022
    Cite
    (2022). Scopus [Dataset]. http://identifiers.org/RRID:SCR_022559
    Explore at:
    Dataset updated
    Jul 15, 2022
    Description

    An abstracting and indexing database with full-text links, produced by Elsevier. It combines an expertly curated abstract and citation database with enriched data and linked scholarly literature across a wide variety of disciplines.

  3. COUNTRIES Research & Science Dataset - SCImagoJR

    • kaggle.com
    zip
    Updated Apr 10, 2025
    Cite
    Ali Jalaali (2025). COUNTRIES Research & Science Dataset - SCImagoJR [Dataset]. https://www.kaggle.com/datasets/alijalali4ai/scimago-country-info-and-rank
    Explore at:
    zip (54895151 bytes)
    Dataset updated
    Apr 10, 2025
    Authors
    Ali Jalaali
    Description


    The SCImago Journal & Country Rank is a publicly available portal that includes the journal and country scientific indicators developed from the information contained in the Scopus® database (Elsevier B.V.). These indicators can be used to assess and analyze scientific domains. Country rankings may also be compared or analyzed separately.

    ✅Collected by: SCImagoJR Country Data Collector Notebook

    💬Also have a look at
    💡 UNIVERSITIES & Research INSTITUTIONS Rank - SCImagoIR
    💡 Scientific JOURNALS Indicators & Info - SCImagoJR

    • 27 major thematic subject areas as well as 309 specific subject categories according to Scopus® Classification.
    • Citation data is drawn from over 34,100 titles from more than 5,000 international publishers
    • SCImago is a research group from the Consejo Superior de Investigaciones Científicas (CSIC), University of Granada, Extremadura, Carlos III (Madrid) and Alcalá de Henares, dedicated to information analysis, representation and retrieval by means of visualisation techniques.

    ☢️❓The entire dataset is obtained from public and open-access data of ScimagoJR (SCImago Journal & Country Rank)
    ScimagoJR Country Rank
    SCImagoJR About Us

    Available indicators:

    • Documents: Number of documents published during the selected year. It is usually called the country's scientific output.

    • Citable Documents: Selected year citable documents. Exclusively articles, reviews and conference papers are considered.

    • Citations: Number of citations received by the documents published during the source year, i.e. citations in years X, X+1, X+2, X+3... to documents published during year X. When referred to the period 1996-2021, all documents published during this period are considered.

    • Citations per Document: Average citations per document published during the source year, i.e. citations in years X, X+1, X+2, X+3... to documents published during year X. When referred to the period 1996-2021, all documents published during this period are considered.

    • Self Citations: Country self-citations. Number of self-citations of all dates received by the documents published during the source year, i.e. self-citations in years X, X+1, X+2, X+3... to documents published during year X. When referred to the period 1996-2021, all documents published during this period are considered.

    • H index: The h index is a country's number of articles (h) that have received at least h citations each. It quantifies both a country's scientific productivity and its scientific impact, and is also applicable to scientists, journals, etc. (a minimal computation is sketched below).
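
    As a concrete illustration of the definition above, here is a minimal h-index computation (an illustrative sketch, not SCImago's code):

        def h_index(citations):
            """Largest h such that h documents each have at least h citations."""
            h = 0
            for i, c in enumerate(sorted(citations, reverse=True), start=1):
                if c >= i:
                    h = i
                else:
                    break
            return h

        print(h_index([10, 8, 5, 4, 3]))  # 4: four documents have at least 4 citations each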

  4. Scimago Journal Rankings

    • hgxjs.org
    • search.webdepozit.sk
    • +5 more
    csv
    Updated Oct 7, 2024
    Cite
    Scimago Lab (2024). Scimago Journal Rankings [Dataset]. http://hgxjs.org/journalrank0138.html
    Explore at:
    csv
    Dataset updated
    Oct 7, 2024
    Dataset authored and provided by
    Scimago Lab
    Description

    Academic journal indicators developed from the information contained in the Scopus database (Elsevier B.V.). These indicators can be used to assess and analyze scientific domains.

  5. Scopus source title list: aggregated data (2011-2018)

    • data.mendeley.com
    • narcis.nl
    Updated Aug 26, 2019
    Cite
    Frederique Bordignon (2019). Scopus source title list: aggregated data (2011-2018) [Dataset]. http://doi.org/10.17632/855x2zwjd2.1
    Explore at:
    Dataset updated
    Aug 26, 2019
    Authors
    Frederique Bordignon
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Scopus coverage updates (2011-2018). Data aggregated from Elsevier title list files; it provides:

    • Source ID in Scopus
    • Title
    • ISSN
    • ESSN
    • Year of the title list file used as source
    • SNIP
    • Open access status
    • Status in the database (Added, previously indexed)
    • Field
    • Subfield
    • ASJC code
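
    A short sketch for exploring this dataset with pandas; the file name and column labels below are assumptions based on the field list above, so check the actual headers in the downloaded file before use:

        import pandas as pd

        # hypothetical file name; column names assumed from the field list above
        df = pd.read_csv("scopus_source_titles_2011_2018.csv")

        # e.g., count sources newly added to Scopus per title-list year
        added_per_year = df[df["Status"] == "Added"].groupby("Year").size()
        print(added_per_year)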

  6. Elsevier Memento data

    • figshare.com
    application/gzip
    Updated Jan 19, 2016
    Cite
    Martin Klein (2016). Elsevier Memento data [Dataset]. http://doi.org/10.6084/m9.figshare.1132675.v1
    Explore at:
    application/gzip
    Dataset updated
    Jan 19, 2016
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    Martin Klein
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This file contains Memento information for all URIs from the Elsevier dataset.

  7. Distribution of books by Elsevier Academic by publication date

    • workwithdata.com
    Updated Apr 17, 2025
    + more versions
    Cite
    Work With Data (2025). Distribution of books by Elsevier Academic by publication date [Dataset]. https://www.workwithdata.com/charts/books?agg=count&chart=bar&f=1&fcol0=book_publisher&fop0=%3D&fval0=Elsevier+Academic&x=publication_date&y=records
    Explore at:
    Dataset updated
    Apr 17, 2025
    Dataset authored and provided by
    Work With Data
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This bar chart displays the count of books by publication date, filtered to books whose publisher is Elsevier Academic.

  8. elsevier.com Traffic Analytics Data

    • analytics.explodingtopics.com
    Updated Oct 1, 2025
    Cite
    (2025). elsevier.com Traffic Analytics Data [Dataset]. https://analytics.explodingtopics.com/website/elsevier.com
    Explore at:
    Dataset updated
    Oct 1, 2025
    Variables measured
    Global Rank, Monthly Visits, Authority Score, US Country Rank, Publishing Category Rank
    Description

    Traffic analytics, rankings, and competitive metrics for elsevier.com as of October 2025

  9. Elsevier Limited Export Import Data | Eximpedia

    • eximpedia.app
    Updated Jan 7, 2025
    + more versions
    Cite
    (2025). Elsevier Limited Export Import Data | Eximpedia [Dataset]. https://www.eximpedia.app/companies/elsevier-limited/82117368
    Explore at:
    Dataset updated
    Jan 7, 2025
    Description

    Export-import data for Elsevier Limited. Follow the Eximpedia platform for HS codes, importer-exporter records, and customs shipment details.

  10. Data from: The University of California’s Split with Elsevier

    • hsscommons.rs-dev.uvic.ca
    • hsscommons.ca
    Updated Oct 23, 2023
    Cite
    Caroline Winter (2023). The University of California’s Split with Elsevier [Dataset]. http://doi.org/10.80230/FZ3Q-HK03
    Explore at:
    Dataset updated
    Oct 23, 2023
    Dataset provided by
    Canadian HSS Commons
    Authors
    Caroline Winter
    Description

    On February 28, 2019, the University of California (UC) announced that it would not renew its subscriptions to Elsevier journals. UC is a public research university in California, USA, with 10 campuses across the state.

  11. 2021 Elsevier Journal Title Level Pricing for Seven U.S. Research...

    • iastate.figshare.com
    pdf
    Updated May 31, 2023
    Cite
    Curtis Brundy; Joel Thornton (2023). 2021 Elsevier Journal Title Level Pricing for Seven U.S. Research Universities [Dataset]. http://doi.org/10.25380/iastate.19742416.v1
    Explore at:
    pdf
    Dataset updated
    May 31, 2023
    Dataset provided by
    Iowa State University
    Authors
    Curtis Brundy; Joel Thornton
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This data includes 2021 Elsevier title-level pricing for seven universities:

    • Florida State University
    • Iowa State University
    • University of North Carolina, Chapel Hill
    • West Virginia University
    • Purdue University
    • University of Virginia
    • an anonymous university named “Institution A”

    In addition, the data includes a summary analysis listing, for each university, the 2021 Published List Price, Adjustment from List Price, Average Cost per Journal, and number of Subscribed Titles. This data will be of interest to anyone examining title-level pricing from a major commercial publisher.

  12. Scopus API Scripts for Data Reuse Project

    • databank.illinois.edu
    Updated Apr 26, 2021
    Cite
    William Mischo (2021). Scopus API Scripts for Data Reuse Project [Dataset]. http://doi.org/10.13012/B2IDB-0988473_V1
    Explore at:
    Dataset updated
    Apr 26, 2021
    Authors
    William Mischo
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    To generate the bibliographic and survey data supporting a data reuse study conducted by several Library faculty and accepted for publication in the Journal of Academic Librarianship, the project team utilized a series of web-based scripts that employed several different endpoints of the Scopus API. The related dataset: "Data for: An Examination of Data Reuse Practices within Highly Cited Articles of Faculty at a Research University" contains survey design and results.
    1) getScopus_API_process_dmp_IDB.asp: uses the Scopus Search API to query for papers by UIUC authors published in 2015 -- limited to one of 9 pre-defined Scopus subject areas -- and retrieve metadata results sorted highest to lowest by the number of times the retrieved articles were cited. The URL for the basic searches took the following form: https://api.elsevier.com/content/search/scopus?query=(AFFIL%28(urbana%20OR%20champaign) AND univ*%29) OR (AF-ID(60000745) OR AF-ID(60005290))&apikey=xxxxxx&start=" & nstart & "&count=25&date=2015&view=COMPLETE&sort=citedby-count&subj=PHYS
    Here, the variable nstart was incremented by 25 in each iteration and 25 records were retrieved in each pass. The subject area was renamed (e.g. from PHYS to COMP for computer science) in each of the 9 runs. This script does not use the Scopus API cursor but downloads 25 records at a time, up to 27 times -- a maximum of 675 bibliographic records. The project team felt that looking at the 675 most-cited articles from UIUC faculty in each of the 9 subject areas was sufficient to gather a robust, representative sample of articles from 2015. These downloaded records were stored in a temporary table that was renamed for each of the 9 subject areas.
    2) get_citing_from_surveys_IDB.asp: takes a Scopus article ID (eid) from the 49 UIUC author returned surveys and retrieves short citing article references, 200 at a time, into a temporary composite table. These citing records contain only one author, no author affiliations, and no author email addresses. This script uses the Scopus API cursor=* feature and is able to download all the citing references of an article 200 records at a time.
    3) put_in_all_authors_affil_IDB.asp: adds important data to the short citing records. The script adds all co-authors and their affiliations, the corresponding author, and author email addresses.
    4) process_for_final_IDB.asp: creates a relational database table with author, title, and source journal information for each of the citing articles that can be copied as an Excel file for processing by the Qualtrics survey software. This was initially 4,626 citing articles over the 49 UIUC authored articles, but was reduced to 2,041 entries after checking for available email addresses and eliminating duplicates.
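
    As a modern-language sketch of the pagination pattern in script 1 (the original scripts are classic ASP; this Python version assumes a valid API key and follows the URL form quoted above, so treat it as an illustration rather than the project's code):

        import requests

        BASE = "https://api.elsevier.com/content/search/scopus"
        API_KEY = "xxxxxx"  # placeholder, as in the URL above
        QUERY = ("(AFFIL((urbana OR champaign) AND univ*)) "
                 "OR (AF-ID(60000745) OR AF-ID(60005290))")

        records = []
        for nstart in range(0, 675, 25):  # 27 passes of 25 records, 675 maximum
            resp = requests.get(BASE, params={
                "query": QUERY, "apiKey": API_KEY,
                "start": nstart, "count": 25,
                "date": "2015", "view": "COMPLETE",
                "sort": "citedby-count", "subj": "PHYS",  # renamed per subject area
            })
            resp.raise_for_status()
            entries = resp.json().get("search-results", {}).get("entry", [])
            if not entries:  # stop early if the result set is exhausted
                break
            records.extend(entries)
        print(len(records), "records retrieved")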

  13. World Top 2% Scientists 2021 Database

    • kaggle.com
    zip
    Updated Nov 14, 2021
    Cite
    dasmehdixtr (2021). World Top 2% Scientists 2021 Database [Dataset]. https://www.kaggle.com/datasets/dasmehdixtr/world-top-2-scientists-2021-database/discussion
    Explore at:
    zip (147071648 bytes)
    Dataset updated
    Nov 14, 2021
    Authors
    dasmehdixtr
    Area covered
    World
    Description

    August 2021 data-update for "Updated science-wide author databases of standardized citation indicators"

    Description

    Citation metrics are widely used and misused. We have created a publicly available database of over 100,000 top scientists that provides standardized information on citations, h-index, co-authorship adjusted hm-index, citations to papers in different authorship positions and a composite indicator. Separate data are shown for career-long and single year impact. Metrics with and without self-citations and the ratio of citations to citing papers are given. Scientists are classified into 22 scientific fields and 176 sub-fields. Field- and subfield-specific percentiles are also provided for all scientists who have published at least 5 papers. Career-long data are updated to end-of-2020. The selection is based on the top 100,000 by c-score (with and without self-citations) or a percentile rank of 2% or above.

    The dataset and code provide an update to the previously released version-1 data at https://doi.org/10.17632/btchxktzyw.1. The version-2 dataset, based on the May 06, 2020 snapshot from Scopus and updated to citation year 2019, is available at https://doi.org/10.17632/btchxktzyw.2.

    This version (3) is based on the Aug 01, 2021 snapshot from Scopus and is updated to citation year 2020.

    Baas, Jeroen; Boyack, Kevin; Ioannidis, John P.A. (2021), “August 2021 data-update for "Updated science-wide author databases of standardized citation indicators"”, Mendeley Data, V3, doi: 10.17632/btchxktzyw.3

    For more details, see the Mendeley Data record cited above.

  14. Data from: Elsevier OA CC-BY Corpus

    • elsevier.digitalcommonsdata.com
    Updated Aug 3, 2020
    + more versions
    Cite
    Daniel Kershaw (2020). Elsevier OA CC-BY Corpus [Dataset]. http://doi.org/10.17632/zm33cdndxs.1
    Explore at:
    Dataset updated
    Aug 3, 2020
    Authors
    Daniel Kershaw
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This is a corpus of 40,001 open access (OA) CC-BY articles from across Elsevier’s journals, representing the first cross-discipline research dataset of this scale released to support NLP and ML research.

    This dataset was released to support the development of ML and NLP models targeting science articles from across all research domains. While the release builds on other datasets designed for specific domains and tasks, it will allow for similar datasets to be derived or for the development of models which can be applied and tested across domains.
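
    A minimal sketch of how such a corpus might be iterated for NLP work, assuming one JSON file per article; the directory layout and the field names ("metadata", "title", "abstract") are illustrative assumptions, not a confirmed schema, so verify them against the corpus documentation:

        import json
        from pathlib import Path

        def iter_articles(corpus_dir):
            # assumes one JSON document per article (layout is an assumption)
            for path in Path(corpus_dir).glob("*.json"):
                with open(path, encoding="utf-8") as f:
                    yield json.load(f)

        for article in iter_articles("json/"):
            meta = article.get("metadata", {})  # assumed field names
            title = meta.get("title", "")
            abstract = article.get("abstract", "")
            # feed title/abstract into a tokenizer or model here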

  15. Data of top 50 most cited articles about COVID-19 and the complications of...

    • data.niaid.nih.gov
    • search.dataone.org
    • +1 more
    zip
    Updated Jan 10, 2024
    Cite
    Tanya Singh; Jagadish Rao Padubidri; Pavanchand Shetty H; Matthew Antony Manoj; Therese Mary; Bhanu Thejaswi Pallempati (2024). Data of top 50 most cited articles about COVID-19 and the complications of COVID-19 [Dataset]. http://doi.org/10.5061/dryad.tx95x6b4m
    Explore at:
    zip
    Dataset updated
    Jan 10, 2024
    Dataset provided by
    Kasturba Medical College, Mangalore
    Authors
    Tanya Singh; Jagadish Rao Padubidri; Pavanchand Shetty H; Matthew Antony Manoj; Therese Mary; Bhanu Thejaswi Pallempati
    License

    CC0 1.0, https://spdx.org/licenses/CC0-1.0.html

    Description

    Background

    This bibliometric analysis examines the top 50 most-cited articles on COVID-19 complications, offering insights into the multifaceted impact of the virus. Since its emergence in Wuhan in December 2019, COVID-19 has evolved into a global health crisis, with over 770 million confirmed cases and 6.9 million deaths as of September 2023. Initially recognized as a respiratory illness causing pneumonia and ARDS, its diverse complications extend to cardiovascular, gastrointestinal, renal, hematological, neurological, endocrinological, ophthalmological, hepatobiliary, and dermatological systems.

    Methods

    Identifying the top 50 articles from a pool of 5940 in Scopus, the analysis spans November 2019 to July 2021, employing terms related to COVID-19 and complications. Rigorous review criteria excluded non-relevant studies, basic science research, and animal models. The authors independently reviewed articles, considering factors like title, citations, publication year, journal, impact factor, authors, study details, and patient demographics.

    Results

    The focus is primarily on 2020 publications (96%), with all articles being open-access. Leading journals include The Lancet, NEJM, and JAMA, with prominent contributions from Internal Medicine (46.9%) and Pulmonary Medicine (14.5%). China played a major role (34.9%), followed by France and Belgium. Clinical features were the primary study topic (68%), often utilizing retrospective designs (24%). Among 22,477 patients analyzed, 54.8% were male, with the most common age group being 26-65 years (63.2%). Complications affected 13.9% of patients, with a recovery rate of 57.8%.

    Conclusion

    Analyzing these top-cited articles offers clinicians and researchers a comprehensive, timely understanding of influential COVID-19 literature. This approach uncovers attributes contributing to high citations and provides authors with valuable insights for crafting impactful research. As a strategic tool, this analysis facilitates staying updated and making meaningful contributions to the dynamic field of COVID-19 research.

    Detailed methods

    A bibliometric analysis of the most-cited articles about COVID-19 complications was conducted in July 2021 using all journals indexed in Elsevier’s Scopus and Thomson Reuters’ Web of Science from November 1, 2019 to July 1, 2021. All journals were selected for inclusion regardless of country of origin, language, medical speciality, or electronic availability of articles or abstracts.

    The terms were combined as follows: (“COVID-19” OR “COVID19” OR “SARS-COV-2” OR “SARSCOV2” OR “SARS 2” OR “Novel coronavirus” OR “2019-nCov” OR “Coronavirus”) AND (“Complication” OR “Long Term Complication” OR “Post-Intensive Care Syndrome” OR “Venous Thromboembolism” OR “Acute Kidney Injury” OR “Acute Liver Injury” OR “Post COVID-19 Syndrome” OR “Acute Cardiac Injury” OR “Cardiac Arrest” OR “Stroke” OR “Embolism” OR “Septic Shock” OR “Disseminated Intravascular Coagulation” OR “Secondary Infection” OR “Blood Clots” OR “Cytokine Release Syndrome” OR “Paediatric Inflammatory Multisystem Syndrome” OR “Vaccine Induced Thrombosis with Thrombocytopenia Syndrome” OR “Aspergillosis” OR “Mucormycosis” OR “Autoimmune Thrombocytopenia Anaemia” OR “Immune Thrombocytopenia” OR “Subacute Thyroiditis” OR “Acute Respiratory Failure” OR “Acute Respiratory Distress Syndrome” OR “Pneumonia” OR “Subcutaneous Emphysema” OR “Pneumothorax” OR “Pneumomediastinum” OR “Encephalopathy” OR “Pancreatitis” OR “Chronic Fatigue” OR “Rhabdomyolysis” OR “Neurologic Complication” OR “Cardiovascular Complications” OR “Psychiatric Complication” OR “Respiratory Complication” OR “Cardiac Complication” OR “Vascular Complication” OR “Renal Complication” OR “Gastrointestinal Complication” OR “Haematological Complication” OR “Hepatobiliary Complication” OR “Musculoskeletal Complication” OR “Genitourinary Complication” OR “Otorhinolaryngology Complication” OR “Dermatological Complication” OR “Paediatric Complication” OR “Geriatric Complication” OR “Pregnancy Complication”) in the Title, Abstract or Keyword.

    A total of 5940 articles were accessed, of which the top 50 most-cited articles about COVID-19 and complications of COVID-19 were selected through Scopus. Each article was reviewed for its appropriateness for inclusion. The articles were independently reviewed by three researchers (JRP, MAM and TS) (Table 1). Differences in opinion with regard to article inclusion were resolved by consensus. The inclusion criteria specified articles that were focused on COVID-19 and complications of COVID-19. Articles were excluded if they did not relate to COVID-19 and/or its complications, as were basic science research and studies using animal models or phantoms. Review articles, viewpoints, guidelines, perspectives, and meta-analyses were also excluded from the top 50 most-cited articles (Table 1).

    The top 50 most-cited articles were compiled in a single database and the relevant data were extracted. The database included: Article Title, Scopus Citations, Year of Publication, Journal, Journal Impact Factor, Authors, Number of Authors, Department Affiliation, Number of Institutions, Country of Origin, Study Topic, Study Design, Sample Size, Open Access, Non-Original Articles, Patient/Participant Age, Gender, Symptoms, Signs, Co-morbidities, Complications, Imaging Modalities Used, and Outcome.
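
    As an illustration of how a combined boolean query like the one above can be assembled programmatically (a sketch, not the authors' code; the term lists are abbreviated, and the Scopus TITLE-ABS-KEY field code stands in for "Title, Abstract or Keyword"):

        # abbreviated term lists; the full lists appear in the paragraph above
        covid_terms = ["COVID-19", "COVID19", "SARS-COV-2", "SARSCOV2",
                       "Novel coronavirus", "2019-nCov", "Coronavirus"]
        complication_terms = ["Complication", "Venous Thromboembolism",
                              "Acute Kidney Injury", "Stroke", "Pneumonia"]

        def or_group(terms):
            # quote each term and join with OR inside parentheses
            return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

        query = f"TITLE-ABS-KEY({or_group(covid_terms)} AND {or_group(complication_terms)})"
        print(query)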

  16. FAIRsharing record for: Elsevier - The Lancet - Information for Authors

    • fairsharing.org
    Updated Jun 2, 2016
    + more versions
    Cite
    (2016). FAIRsharing record for: Elsevier - The Lancet - Information for Authors [Dataset]. http://doi.org/10.25504/FAIRsharing.8fyd11
    Explore at:
    Dataset updated
    Jun 2, 2016
    License

    Attribution-ShareAlike 4.0 (CC BY-SA 4.0), https://creativecommons.org/licenses/by-sa/4.0/
    License information was derived automatically

    Description

    This FAIRsharing record describes: The Information for Authors page contains a broad range of guidelines for publishing in The Lancet. With regards to data deposition, novel gene sequences should be deposited in a public database (GenBank, EMBL, or DDBJ), and the accession number provided. Authors of microarray papers should include in their submission the information recommended by the MIAME guidelines. Authors should also submit their experimental details to one of the publicly available databases: ArrayExpress or GEO.

  17. Polymer engineering and science Impact Factor 2024-2025 - ResearchHelpDesk

    • researchhelpdesk.org
    Updated Feb 23, 2022
    + more versions
    Cite
    Research Help Desk (2022). Polymer engineering and science Impact Factor 2024-2025 - ResearchHelpDesk [Dataset]. https://www.researchhelpdesk.org/journal/impact-factor-if/458/polymer-engineering-and-science
    Explore at:
    Dataset updated
    Feb 23, 2022
    Dataset authored and provided by
    Research Help Desk
    Description

    Polymer Engineering and Science - Every day, the Society of Plastics Engineers (SPE) takes action to help companies in the plastics industry succeed: by spreading knowledge, strengthening skills, and promoting plastics. Employing these strategies, SPE has helped the plastics industry thrive for over 60 years, developing in the process a 25,000-member network of leading engineers and other plastics professionals, including technicians, salespeople, marketers, retailers, and representatives from tertiary industries. For more than 30 years, Polymer Engineering & Science has been one of the most highly regarded journals in the field, serving as a forum for authors of treatises on the cutting edge of polymer science and technology. The importance of PE&S is underscored by the rate at which its articles are cited, literally thousands of times a year. Engineers, researchers, technicians, and academicians worldwide look to PE&S for the valuable information they need. There are also special issues compiled by distinguished guest editors, containing proceedings of symposia on such diverse topics as polyblends, mechanics of plastics, and polymer welding.

    Abstracting and Indexing Information:

    • Academic ASAP (GALE Cengage)
    • Advanced Technologies & Aerospace Database (ProQuest)
    • Applied Science & Technology Index/Abstracts (EBSCO Publishing)
    • CAS: Chemical Abstracts Service (ACS)
    • CCR Database (Clarivate Analytics)
    • Chemical Abstracts Service/SciFinder (ACS)
    • Chemistry Server Reaction Center (Clarivate Analytics)
    • ChemWeb (ChemIndustry.com)
    • Chimica Database (Elsevier)
    • COMPENDEX (Elsevier)
    • Current Contents: Engineering, Computing & Technology (Clarivate Analytics)
    • Current Contents: Physical, Chemical & Earth Sciences (Clarivate Analytics)
    • Expanded Academic ASAP (GALE Cengage)
    • InfoTrac (GALE Cengage)
    • Journal Citation Reports/Science Edition (Clarivate Analytics)
    • Materials Science & Engineering Database (ProQuest)
    • PASCAL Database (INIST/CNRS)
    • Polymer Library (iSmithers RAPRA)
    • ProQuest Central (ProQuest)
    • ProQuest Central K-462
    • Reaction Citation Index (Clarivate Analytics)
    • Research Library (ProQuest)
    • Research Library Prep (ProQuest)
    • Science Citation Index (Clarivate Analytics)
    • Science Citation Index Expanded (Clarivate Analytics)
    • SciTech Premium Collection (ProQuest)
    • SCOPUS (Elsevier)
    • STEM Database (ProQuest)
    • Technology Collection (ProQuest)
    • Web of Science (Clarivate Analytics)

  18. Distribution of books by Saunders/ Elsevier by publication date

    • workwithdata.com
    Updated Apr 17, 2025
    + more versions
    Cite
    Work With Data (2025). Distribution of books by Saunders/ Elsevier by publication date [Dataset]. https://www.workwithdata.com/charts/books?agg=count&chart=bar&f=1&fcol0=book_publisher&fop0=%3D&fval0=Saunders%2F+Elsevier&x=publication_date&y=records
    Explore at:
    Dataset updated
    Apr 17, 2025
    Dataset authored and provided by
    Work With Data
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This bar chart displays the count of books by publication date, filtered to books whose publisher is Saunders/ Elsevier.

  19. ThermoML Representation of Published Experimental Data from Thermochimica...

    • trc.nist.gov
    + more versions
    Cite
    Thermodynamics Research Center, ThermoML Representation of Published Experimental Data from Thermochimica Acta (0040-6031, Elsevier) [Dataset]. http://doi.org/10.1016/j.tca.2008.09.004.html
    Explore at:
    Dataset authored and provided by
    Thermodynamics Research Center
    License

    https://www.nist.gov/open/license

    Description

    This dataset contains links to ThermoML files, which represent experimental thermophysical and thermochemical property data reported in the corresponding articles published by major journals in the field. These files are posted here through cooperation between the Thermodynamics Research Center (TRC) at the National Institute of Standards and Technology (NIST) and Elsevier. The ThermoML files corresponding to articles in the journals are available here with permission of the journal publishers.
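
    For readers new to the format, a minimal sketch for peeking inside a ThermoML file with Python's standard library (the file name is hypothetical, and the element vocabulary should be checked against the published ThermoML schema):

        import xml.etree.ElementTree as ET

        tree = ET.parse("example.thermoml.xml")  # hypothetical local file name
        root = tree.getroot()

        # list the top-level elements to get a feel for the file's structure
        for child in root:
            print(child.tag)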

  20. Dataset: Shell Commands Used by Participants of Hands-on Cybersecurity...

    • data.niaid.nih.gov
    • data-staging.niaid.nih.gov
    Updated Jul 18, 2023
    Cite
    Valdemar Švábenský; Jan Vykopal; Pavel Seda; Pavel Čeleda (2023). Dataset: Shell Commands Used by Participants of Hands-on Cybersecurity Training [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_5137354
    Explore at:
    Dataset updated
    Jul 18, 2023
    Dataset provided by
    Masaryk University
    Authors
    Valdemar Švábenský; Jan Vykopal; Pavel Seda; Pavel Čeleda
    License

    Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This repository contains supplementary materials for the following journal paper:

    Valdemar Švábenský, Jan Vykopal, Pavel Seda, Pavel Čeleda. Dataset of Shell Commands Used by Participants of Hands-on Cybersecurity Training. In Elsevier Data in Brief. 2021. https://doi.org/10.1016/j.dib.2021.107398

    How to cite

    If you use or build upon the materials, please use the BibTeX entry below to cite the original paper (not only this web link).

    @article{Svabensky2021dataset,
      author    = {\v{S}v\'{a}bensk\'{y}, Valdemar and Vykopal, Jan and Seda, Pavel and \v{C}eleda, Pavel},
      title     = {{Dataset of Shell Commands Used by Participants of Hands-on Cybersecurity Training}},
      journal   = {Data in Brief},
      publisher = {Elsevier},
      volume    = {38},
      year      = {2021},
      issn      = {2352-3409},
      url       = {https://doi.org/10.1016/j.dib.2021.107398},
      doi       = {10.1016/j.dib.2021.107398},
    }

    The data were collected using a logging toolset referenced here.

    Attached content

    Dataset (data.zip). The collected data are attached here on Zenodo. A copy is also available in this repository.

    Analytical tools (toolset.zip). To analyze the data, you can instantiate the toolset or this project for ELK.

    Version history

    Version 1 (https://zenodo.org/record/5137355) contains 13446 log records from 175 trainees. These data are precisely those that are described in the associated journal paper. Version 1 provides a snapshot of the state when the article was published.

    Version 2 (https://zenodo.org/record/5517479) contains 13446 log records from 175 trainees. The data are unchanged from Version 1, but the analytical toolset includes a minor fix.

    Version 3 (https://zenodo.org/record/6670113) contains 21762 log records from 275 trainees. It is a superset of Version 2, with newly collected data added to the dataset.

    The current Version 4 (https://zenodo.org/record/8136017) contains 21459 log records from 275 trainees. Compared to Version 3, we cleaned 303 invalid/duplicate command records.
