62 datasets found
  1. SAS code used to analyze data and a datafile with metadata glossary

    • catalog.data.gov
    • data.amerigeoss.org
    Updated Nov 12, 2020
    + more versions
    Cite
    U.S. EPA Office of Research and Development (ORD) (2020). SAS code used to analyze data and a datafile with metadata glossary [Dataset]. https://catalog.data.gov/dataset/sas-code-used-to-analyze-data-and-a-datafile-with-metadata-glossary
    Dataset updated
    Nov 12, 2020
    Dataset provided by
    United States Environmental Protection Agency (http://www.epa.gov/)
    Description

    We compiled macroinvertebrate assemblage data collected from 1995 to 2014 from the St. Louis River Area of Concern (AOC) of western Lake Superior. Our objective was to define depth-adjusted cutoff values for benthos condition classes (poor, fair, reference) to provide a tool useful for assessing progress toward achieving removal targets for the degraded benthos beneficial use impairment in the AOC. The relationship between depth and benthos metrics was wedge-shaped. We therefore used quantile regression to model the limiting effect of depth on selected benthos metrics, including taxa richness, percent non-oligochaete individuals, combined percent Ephemeroptera, Trichoptera, and Odonata individuals, and density of ephemerid mayfly nymphs (Hexagenia). We created a scaled trimetric index from the first three metrics. Metric values at or above the 90th percentile quantile regression model prediction were defined as reference condition for that depth. We set the cutoff between poor and fair condition as the 50th percentile model prediction. We examined sampler type, exposure, geographic zone of the AOC, and substrate type for confounding effects. Based on these analyses we combined data across sampler type and exposure classes and created separate models for each geographic zone. We used the resulting condition class cutoff values to assess the relative benthic condition for three habitat restoration project areas. The depth-limited pattern of ephemerid abundance we observed in the St. Louis River AOC also occurred elsewhere in the Great Lakes. We provide tabulated model predictions for application of our depth-adjusted condition class cutoff values to new sample data. This dataset is associated with the following publication: Angradi, T., W. Bartsch, A. Trebitz, V. Brady, and J. Launspach. A depth-adjusted ambient distribution approach for setting numeric removal targets for a Great Lakes Area of Concern beneficial use impairment: Degraded benthos. JOURNAL OF GREAT LAKES RESEARCH. International Association for Great Lakes Research, Ann Arbor, MI, USA, 43(1): 108-120, (2017).
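
    As a rough sketch of the approach described above (not the archived EPA code), the 50th and 90th percentile depth models could be fit in SAS with PROC QUANTREG; the data set and variable names below are placeholders.

      /* hypothetical sketch: data set and variable names are placeholders */
      proc quantreg data=benthos;
         model richness = depth / quantile=0.50 0.90;
         output out=cutoffs pred=cut;   /* one predicted column per requested quantile */
      run;

      /* the tau=0.50 predictions give the poor/fair boundary at each depth, */
      /* and the tau=0.90 predictions give the reference boundary            */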

  2. SSMT SAS data set

    • figshare.com
    txt
    Updated Nov 26, 2021
    Cite
    Thiago Bernardino (2021). SSMT SAS data set [Dataset]. http://doi.org/10.6084/m9.figshare.17086745.v1
    Available download formats: txt
    Dataset updated
    Nov 26, 2021
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    Thiago Bernardino
    License
    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    SAS PROC used to evaluate SSMT data

  3. SAS-2 Map Product Catalog - Dataset - NASA Open Data Portal

    • data.nasa.gov
    Updated Apr 1, 2025
    Cite
    nasa.gov (2025). SAS-2 Map Product Catalog - Dataset - NASA Open Data Portal [Dataset]. https://data.nasa.gov/dataset/sas-2-map-product-catalog
    Dataset updated
    Apr 1, 2025
    Dataset provided by
    NASA (http://nasa.gov/)
    Description

    This database is a collection of maps created from the 28 SAS-2 observation files. The original observation files can be accessed within BROWSE by changing to the SAS2RAW database. For each of the SAS-2 observation files, the analysis package FADMAP was run and the resulting maps, plus GIF images created from these maps, were collected into this database. Each map is a 60 x 60 pixel FITS format image with 1 degree pixels. The user may reconstruct any of these maps within the captive account by running FADMAP from the command line after extracting a file from within the SAS2RAW database. The parameters used for selecting data for these product map files are embedded as keywords in the FITS maps themselves. These parameters are set in FADMAP, and for the maps in this database are set as 'wide open' as possible. That is, except for selecting on each of 3 energy ranges, all other FADMAP parameters were set using broad criteria. To find more information about how to run FADMAP on the raw events file, the user can access help files within the SAS2RAW database or can use the 'fhelp' facility from the command line to gain information about FADMAP. This is a service provided by NASA HEASARC.

  4. Supplement 1. SAS code and data set for obtaining the results described in...

    • wiley.figshare.com
    html
    Updated May 31, 2023
    Cite
    Jay M. Ver Hoef; Peter L. Boveng (2023). Supplement 1. SAS code and data set for obtaining the results described in this paper. [Dataset]. http://doi.org/10.6084/m9.figshare.3528452.v1
    Available download formats: html
    Dataset updated
    May 31, 2023
    Dataset provided by
    Wiley
    Authors
    Jay M. Ver Hoef; Peter L. Boveng
    License
    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description

    File List
    NBvsPoi_FINAL.sas -- SAS code
    SSEAK98_FINAL.txt -- Harbor seal data used by SAS code

    Description
    The NBvsPoi_FINAL SAS program uses a SAS macro to analyze the data in SSEAK98_FINAL.txt. The SAS program and macro are commented for further explanation.
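
    The file names suggest a comparison of negative binomial and Poisson count models; a minimal sketch of that kind of comparison in SAS is shown below, with a placeholder response and predictor rather than the variables actually coded in NBvsPoi_FINAL.sas.

      /* hypothetical sketch: fit the same count model under Poisson and   */
      /* negative binomial error distributions; names are placeholders     */
      proc genmod data=seals;
         model count = covariate / dist=poisson link=log;
      run;

      proc genmod data=seals;
         model count = covariate / dist=negbin link=log;
      run;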

  5. Eximpedia Export Import Trade

    • eximpedia.app
    Updated Feb 5, 2025
    Cite
    Seair Exim (2025). Eximpedia Export Import Trade [Dataset]. https://www.eximpedia.app/
    Available download formats: .bin, .xml, .csv, .xls
    Dataset updated
    Feb 5, 2025
    Dataset provided by
    Eximpedia PTE LTD
    Eximpedia Export Import Trade Data
    Authors
    Seair Exim
    Area covered
    Moldova (Republic of), Uganda, Nauru, Antarctica, Central African Republic, United States Minor Outlying Islands, Pakistan, Gibraltar, Virgin Islands (U.S.), South Africa
    Description

    Set In Sas Export Import Data. Follow the Eximpedia platform for HS code, importer-exporter records, and customs shipment details.

  6. SAS-2 Map Product Catalog

    • catalog.data.gov
    • s.cnmilf.com
    Updated Sep 19, 2025
    + more versions
    Cite
    High Energy Astrophysics Science Archive Research Center (2025). SAS-2 Map Product Catalog [Dataset]. https://catalog.data.gov/dataset/sas-2-map-product-catalog
    Dataset updated
    Sep 19, 2025
    Dataset provided by
    High Energy Astrophysics Science Archive Research Center
    Description

    This database is a collection of maps created from the 28 SAS-2 observation files. The original observation files can be accessed within BROWSE by changing to the SAS2RAW database. For each of the SAS-2 observation files, the analysis package FADMAP was run and the resulting maps, plus GIF images created from these maps, were collected into this database. Each map is a 60 x 60 pixel FITS format image with 1 degree pixels. The user may reconstruct any of these maps within the captive account by running FADMAP from the command line after extracting a file from within the SAS2RAW database. The parameters used for selecting data for these product map files are embedded as keywords in the FITS maps themselves. These parameters are set in FADMAP, and for the maps in this database are set as 'wide open' as possible. That is, except for selecting on each of 3 energy ranges, all other FADMAP parameters were set using broad criteria. To find more information about how to run FADMAP on the raw events file, the user can access help files within the SAS2RAW database or can use the 'fhelp' facility from the command line to gain information about FADMAP. This is a service provided by NASA HEASARC.

  7. Set In Sas Company profile with phone,email, buyers, suppliers, price,...

    • volza.com
    csv
    Updated Nov 29, 2025
    + more versions
    Cite
    Volza FZ LLC (2025). Set In Sas Company profile with phone,email, buyers, suppliers, price, export import shipments. [Dataset]. https://www.volza.com/company-profile/set-in-sas-24868788
    Available download formats: csv
    Dataset updated
    Nov 29, 2025
    Dataset authored and provided by
    Volza FZ LLC
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Time period covered
    2014 - Sep 30, 2021
    Variables measured
    Count of exporters, Count of importers, Sum of export value, Sum of import value, Count of export shipments, Count of import shipments
    Description

    Credit report of Set In Sas contains unique and detailed export import market intelligence with its phone, email, LinkedIn, and details of each import and export shipment such as product, quantity, price, buyer and supplier names, country, and date of shipment.

  8. SAS scripts for supplementary data.

    • datasetcatalog.nlm.nih.gov
    • figshare.com
    Updated Jul 13, 2015
    Cite
    Geronimo, Jerome T.; Fletcher, Craig A.; Bellinger, Dwight A.; Whitaker, Julia; Vieira, Giovana; Garner, Joseph P.; George, Nneka M. (2015). SAS scripts for supplementary data. [Dataset]. https://datasetcatalog.nlm.nih.gov/dataset?q=0001869731
    Dataset updated
    Jul 13, 2015
    Authors
    Geronimo, Jerome T.; Fletcher, Craig A.; Bellinger, Dwight A.; Whitaker, Julia; Vieira, Giovana; Garner, Joseph P.; George, Nneka M.
    Description

    The raw data for each of the analyses are presented: baseline severity difference (probands only; Figure A in S1 Dataset), repeated measures analysis of change in lesion severity (Figure B in S1 Dataset), logistic regression of survivorship (Figure C in S1 Dataset), and time to cure (Figure D in S1 Dataset). Each data set is given as SAS code for the data itself, together with the equivalent analysis to that performed in JMP (and reported in the text). Data are presented in SAS format because it is a simple text format. The data and code were generated as direct exports from JMP, with additional SAS code added as needed (for instance, JMP does not export code for post-hoc tests). Note, however, that SAS rounds to less precision than JMP and can give slightly different results, especially for REML methods. (DOCX)
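
    As an illustration of the pattern described above (data embedded as a DATA step, followed by the analysis), a heavily simplified sketch with invented variable names and values might look like this; the archived scripts define the real variables and models.

      /* hypothetical sketch only: values and names are invented */
      data lesions;
         input animal $ week severity;
         datalines;
      A1 1 2.0
      A1 2 1.5
      A2 1 3.0
      A2 2 2.5
      ;
      run;

      /* e.g., a repeated-measures (REML) analysis of change in severity */
      proc mixed data=lesions method=reml;
         class animal week;
         model severity = week;
         repeated week / subject=animal;
      run;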

  9. Data Processing and Data Analysis with SAS (Exercise File)

    • dbk.gesis.org
    • da-ra.de
    Updated Apr 13, 2010
    Cite
    Uehlinger, Hans-Martin (2010). Data Processing and Data Analysis with SAS (Exercise File) [Dataset]. http://doi.org/10.4232/1.1232
    Dataset updated
    Apr 13, 2010
    Dataset provided by
    GESIS - Leibniz Institute for the Social Sciences
    Authors
    Uehlinger, Hans-Martin
    License

    https://dbk.gesis.org/dbksearch/sdesc2.asp?no=1232

    Description

    Exercise data set for the SAS book by Uehlinger. Sample of individual variables and cases from the data set of ZA Study 0757 (political ideology).

    Topics: most important political problems of the country; political interest; party inclination; behavior at the polls in the Federal Parliament election 1972; political participation and willingness to participate in political protests.

  10. Supplement 1. SAS macro for adaptive cluster sampling and Aletris data sets...

    • wiley.figshare.com
    html
    Updated Jun 1, 2023
    Cite
    Thomas Philippi (2023). Supplement 1. SAS macro for adaptive cluster sampling and Aletris data sets from the example. [Dataset]. http://doi.org/10.6084/m9.figshare.3524501.v1
    Available download formats: html
    Dataset updated
    Jun 1, 2023
    Dataset provided by
    Wiley (https://www.wiley.com/)
    Authors
    Thomas Philippi
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description

    File List
    ACS.zip -- .zip file containing SAS macro and example code, and example Aletris bracteata data sets:
        acs.sas
        chekika_ACS_estimation.sas
        chekika_1.csv
        chekika_2.csv
        philippi.3.1.zip

    Description "acs.sas" is a SAS macro for computing Horvitz-Thompson and Hansen-Horwitz estimates of population size for adaptive cluster sampling with random initial sampling. This version uses ugly base SAS code and does not require SQL or SAS products other than Base SAS, and should work with versions 8.2 onward (tested with versions 9.0 and 9.1). "chekika_ACS_estimation.sas" is example SAS code calling the acs macro to analyze the Chekika Aletris bracteata example data sets. "chekika_1.csv" is an example data set in ASCII comma-delimited format from adaptive cluster sampling of A. bracteata at Chekika, Everglades National Park, with 1-m2 quadrats. "chekika_2.csv" is an example data set in ASCII comma-delimited format from adaptive cluster sampling of A. bracteata at Chekika, Everglades National Park, with 4-m2 quadrats. "philippi.3.1.zip" metadata file generated by morpho, including both xml and css.

  11. Handwritten ASAP Short Answer Scoring

    • zenodo.org
    zip
    Updated Nov 1, 2023
    Cite
    Christian Gold; Torsten Zesch (2023). Handwritten ASAP Short Answer Scoring [Dataset]. http://doi.org/10.5281/zenodo.8088866
    Available download formats: zip
    Dataset updated
    Nov 1, 2023
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Christian Gold; Torsten Zesch
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This dataset is based on the Short Answer Scoring (SAS) dataset of the Automated Student Assessment Prize (ASAP).
    Although the original responses were collected in handwriting, the scans are not available.
    To analyze the full pipeline from handwritten answers to automated scoring, we had students rewrite some answers.
    The texts taken from SAS come from both the test set and the training set.

  12. Supplement 1. The Linum data set and a text file containing instructions and...

    • datasetcatalog.nlm.nih.gov
    • wiley.figshare.com
    Updated Aug 5, 2016
    Cite
    Dale, Mark R. T.; Cahill, James F.; Lamb, Eric G. (2016). Supplement 1. The Linum data set and a text file containing instructions and SAS scripts for analysis of the data set. [Dataset]. https://datasetcatalog.nlm.nih.gov/dataset?q=0001520357
    Dataset updated
    Aug 5, 2016
    Authors
    Dale, Mark R. T.; Cahill, James F.; Lamb, Eric G.
    Description

    File List
    Lamb_et_al_SAScode.txt
    Linumdata.txt

    Description
    The file Lamb_et_al_SAScode.txt contains SAS scripts and instructions for conducting nonlinear regression analyses of the Linum data set. The contents of the file can be pasted directly into the script editor in SAS. The file includes a script to import the Linum data set contained in the file Linumdata.txt into SAS. The file Linumdata.txt contains 4 columns and 40 rows (39 data points, one row with column headings). The columns in the data set are as follows: -- TABLE: Please see in attached file. --
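
    A hedged sketch of the workflow the description implies (import Linumdata.txt, then fit a nonlinear regression) appears below; the column names and model form are placeholders, since the real statements are in Lamb_et_al_SAScode.txt.

      /* hypothetical sketch: column names and model form are placeholders */
      data linum;
         infile "Linumdata.txt" firstobs=2;   /* skip the heading row */
         input x1 x2 x3 y;
      run;

      proc nlin data=linum;
         parms a=1 b=0.1;
         model y = a * exp(-b * x1);          /* placeholder model */
      run;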

  13. Current Population Survey (CPS)

    • dataverse.harvard.edu
    • search.dataone.org
    Updated May 30, 2013
    Cite
    Anthony Damico (2013). Current Population Survey (CPS) [Dataset]. http://doi.org/10.7910/DVN/AK4FDD
    Available download formats: Croissant (a format for machine-learning datasets; see mlcommons.org/croissant)
    Dataset updated
    May 30, 2013
    Dataset provided by
    Harvard Dataverse
    Authors
    Anthony Damico
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description

    analyze the current population survey (cps) annual social and economic supplement (asec) with r. the annual march cps-asec has been supplying the statistics for the census bureau's report on income, poverty, and health insurance coverage since 1948. wow. the us census bureau and the bureau of labor statistics (bls) tag-team on this one. until the american community survey (acs) hit the scene in the early aughts (2000s), the current population survey had the largest sample size of all the annual general demographic data sets outside of the decennial census - about two hundred thousand respondents. this provides enough sample to conduct state- and a few large metro area-level analyses. your sample size will vanish if you start investigating subgroups by state - consider pooling multiple years. county-level is a no-no. despite the american community survey's larger size, the cps-asec contains many more variables related to employment, sources of income, and insurance - and can be trended back to harry truman's presidency. aside from questions specifically asked about an annual experience (like income), many of the questions in this march data set should be treated as point-in-time statistics. cps-asec generalizes to the united states non-institutional, non-active duty military population.

    the national bureau of economic research (nber) provides sas, spss, and stata importation scripts to create a rectangular file (rectangular data means only person-level records; household- and family-level information gets attached to each person). to import these files into r, the parse.SAScii function uses nber's sas code to determine how to import the fixed-width file, then RSQLite to put everything into a schnazzy database. you can try reading through the nber march 2012 sas importation code yourself, but it's a bit of a proc freak show.

    this new github repository contains three scripts:

    2005-2012 asec - download all microdata.R: download the fixed-width file containing household, family, and person records; import by separating this file into three tables, then merge 'em together at the person-level; download the fixed-width file containing the person-level replicate weights; merge the rectangular person-level file with the replicate weights, then store it in a sql database; create a new variable - one - in the data table

    2012 asec - analysis examples.R: connect to the sql database created by the 'download all microdata' program; create the complex sample survey object, using the replicate weights; perform a boatload of analysis examples

    replicate census estimates - 2011.R: connect to the sql database created by the 'download all microdata' program; create the complex sample survey object, using the replicate weights; match the sas output shown in the png file below. 2011 asec replicate weight sas output.png: statistic and standard error generated from the replicate-weighted example sas script contained in this census-provided person replicate weights usage instructions document. click here to view these three scripts

    for more detail about the current population survey - annual social and economic supplement (cps-asec), visit: the census bureau's current population survey page; the bureau of labor statistics' current population survey page; the current population survey's wikipedia article

    notes: interviews are conducted in march about experiences during the previous year. the file labeled 2012 includes information (income, work experience, health insurance) pertaining to 2011. when you use the current population survey to talk about america, subtract a year from the data file name. as of the 2010 file (the interview focusing on america during 2009), the cps-asec contains exciting new medical out-of-pocket spending variables most useful for supplemental (medical spending-adjusted) poverty research. confidential to sas, spss, stata, sudaan users: why are you still rubbing two sticks together after we've invented the butane lighter? time to transition to r. :D
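
    For readers unfamiliar with what parse.SAScii is parsing, an NBER-style fixed-width importation script has roughly the shape sketched below; the column positions, widths, and variable names are invented for illustration and are not the real CPS-ASEC layout.

      /* hypothetical fragment: positions, widths, and names are invented */
      data asec_person;
         infile "asec2012.dat" lrecl=1000 missover;
         input
            @1   prec_type  1.
            @2   age        2.
            @10  wsal_val   8.
         ;
      run;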

  14. SAS / Excel Data set Karthik1

    • figshare.com
    bin
    Updated Jan 24, 2018
    Cite
    Steven Juliano (2018). SAS / Excel Data set Karthik1 [Dataset]. http://doi.org/10.6084/m9.figshare.5745720.v1
    Available download formats: bin
    Dataset updated
    Jan 24, 2018
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    Steven Juliano
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This is the SAS 9.4 data set, along with the same data set as an Excel file, that is the basis and starting point for all analyses.
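
    Either copy of the data could be read into a SAS 9.4 session roughly as sketched below; the paths and file names are placeholders, since the archive's exact file names are not listed in this record (the Excel import also assumes SAS/ACCESS Interface to PC Files is licensed).

      /* hypothetical sketch: paths and file names are placeholders */
      libname k "/path/to/download";        /* folder holding the .sas7bdat */

      data work.karthik1;
         set k.karthik1;                    /* read the SAS 9.4 data set */
      run;

      proc import datafile="/path/to/download/karthik1.xlsx"
                  out=work.karthik1_xl dbms=xlsx replace;
      run;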

  15. Set Gad Sas Export Import Data | Eximpedia

    • eximpedia.app
    Updated Oct 17, 2025
    + more versions
    Cite
    (2025). Set Gad Sas Export Import Data | Eximpedia [Dataset]. https://www.eximpedia.app/companies/set-gad-sas/76544837
    Dataset updated
    Oct 17, 2025
    Description

    Set Gad Sas Export Import Data. Follow the Eximpedia platform for HS code, importer-exporter records, and customs shipment details.

  16. SAS Controller Market Research Report 2033

    • growthmarketreports.com
    csv, pdf, pptx
    Updated Sep 1, 2025
    Cite
    Growth Market Reports (2025). SAS Controller Market Research Report 2033 [Dataset]. https://growthmarketreports.com/report/sas-controller-market
    Available download formats: pdf, pptx, csv
    Dataset updated
    Sep 1, 2025
    Dataset authored and provided by
    Growth Market Reports
    Time period covered
    2024 - 2032
    Area covered
    Global
    Description

    SAS Controller Market Outlook

    According to our latest research, the global SAS Controller market size in 2024 is valued at USD 2.68 billion, driven by the escalating demand for high-performance data storage solutions across diverse sectors. The market is set to witness robust expansion at a CAGR of 6.7% from 2025 to 2033. By the end of 2033, the SAS Controller market is forecasted to reach a valuation of USD 4.88 billion. This growth trajectory is primarily attributed to the increasing adoption of cloud computing, big data analytics, and the proliferation of enterprise applications that require reliable and scalable storage infrastructures.
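
    For reference, a CAGR projection of this kind simply compounds the base-year value forward; using the figures quoted above (USD 2.68 billion in 2024, 6.7% CAGR, nine years to 2033):

      V_2033 = V_2024 x (1 + CAGR)^9
             = 2.68 x (1.067)^9
             ≈ USD 4.8 billion

    which is in line with the USD 4.88 billion forecast once rounding of the growth rate is taken into account.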

    The growth of the SAS Controller market is significantly influenced by the rising demand for advanced data storage technologies in enterprise environments. As organizations continue to generate and process massive volumes of data, the need for robust storage management solutions becomes paramount. SAS controllers, with their ability to offer high-speed data transfer, enhanced scalability, and superior reliability, are becoming the preferred choice over traditional storage interfaces. The rapid adoption of virtualization and cloud-based services further amplifies the need for efficient storage architectures, thereby fueling the demand for SAS controllers across various industry verticals. Moreover, the evolution of data center infrastructure and the shift towards hyper-converged systems are expected to drive sustained investments in SAS controller solutions over the coming years.

    Another key growth factor for the SAS Controller market is the increasing deployment of servers and storage systems in sectors such as BFSI, healthcare, and manufacturing. These industries require seamless data access, secure storage, and high availability to support mission-critical applications. SAS controllers play a vital role in ensuring data integrity and optimizing storage performance, especially in environments where downtime can result in significant financial losses or compromise sensitive information. The growing digital transformation initiatives across both public and private sectors are creating new opportunities for SAS controller vendors to offer innovative products that cater to evolving storage requirements, including support for higher data rates and integration with hybrid storage architectures.

    Technological advancements in SAS controller design, such as the integration of RAID functionalities, enhanced error correction capabilities, and support for next-generation SAS protocols, are also contributing to market growth. Vendors are focusing on developing controllers that can handle increasing data workloads while maintaining energy efficiency and minimizing latency. The emergence of NVMe and SSD-based storage solutions is prompting SAS controller manufacturers to innovate and offer products that provide seamless interoperability and future-proofing for enterprise storage environments. Additionally, the trend towards distributed and edge computing is expected to create further demand for SAS controllers that can deliver high performance in decentralized storage architectures.

    From a regional perspective, North America remains the dominant market for SAS controllers, owing to the presence of major technology companies, advanced IT infrastructure, and the early adoption of innovative storage solutions. However, the Asia Pacific region is witnessing the fastest growth, driven by rapid industrialization, increasing investments in data centers, and the expansion of cloud services. Europe and Latin America are also showing steady growth, supported by digitalization initiatives in various industries. The Middle East & Africa region, although still emerging, presents significant potential as enterprises in the region ramp up their investments in IT modernization and storage infrastructure.

    In the context of technological advancements, the integration of RAID-on-Chip technology within SAS controllers is gaining traction. This innovation allows for the consolidation of RAID functionalities directly onto the controller chip, enhancing performance and reducing latency. RAID-on-Chip solutions offer improved data protection and reliability, which are critical in environments that demand high availability and fault tolerance. As enterprises continue to seek ways

  17. Datenverarbeitung und Datenanalyse mit SAS (Übungsdatei)

    • search.gesis.org
    • da-ra.de
    Updated Apr 13, 2010
    Cite
    Uehlinger, Hans-Martin (2010). Datenverarbeitung und Datenanalyse mit SAS (Übungsdatei) [Dataset]. http://doi.org/10.4232/1.1232
    Dataset updated
    Apr 13, 2010
    Dataset provided by
    GESIS Data Archive
    GESIS search
    Authors
    Uehlinger, Hans-Martin
    License

    https://www.gesis.org/en/institute/data-usage-terms

    Description

    Exercise data set for the SAS book by Uehlinger. Sample of individual variables and cases from the data set of ZA Study 0757 (political ideology).

    Topics: most important political problems of the country; political interest; party inclination; behavior at the polls in the Federal Parliament election 1972; political participation and willingness to participate in political protests.

    Demography: age; sex; marital status; religious denomination; school education; interest in politics; party preference.

  18. Representaciones Artisticas Jet Set Sas Company profile with phone,email,...

    • volza.com
    csv
    Updated Nov 14, 2025
    Cite
    Volza FZ LLC (2025). Representaciones Artisticas Jet Set Sas Company profile with phone,email, buyers, suppliers, price, export import shipments. [Dataset]. https://www.volza.com/company-profile/representaciones-artisticas-jet-set-sas-25352939
    Available download formats: csv
    Dataset updated
    Nov 14, 2025
    Dataset authored and provided by
    Volza FZ LLC
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Time period covered
    2014 - Sep 30, 2021
    Variables measured
    Count of exporters, Count of importers, Sum of export value, Sum of import value, Count of export shipments, Count of import shipments
    Description

    Credit report of Representaciones Artisticas Jet Set Sas contains unique and detailed export import market intelligence with its phone, email, LinkedIn, and details of each import and export shipment such as product, quantity, price, buyer and supplier names, country, and date of shipment.

  19. Health and Retirement Study (HRS)

    • search.dataone.org
    • dataverse.harvard.edu
    Updated Nov 21, 2023
    Cite
    Damico, Anthony (2023). Health and Retirement Study (HRS) [Dataset]. http://doi.org/10.7910/DVN/ELEKOY
    Dataset updated
    Nov 21, 2023
    Dataset provided by
    Harvard Dataverse
    Authors
    Damico, Anthony
    Description

    analyze the health and retirement study (hrs) with r. the hrs is the one and only longitudinal survey of american seniors. with a panel starting its third decade, the current pool of respondents includes older folks who have been interviewed every two years as far back as 1992. unlike cross-sectional or shorter panel surveys, respondents keep responding until, well, death do us part. paid for by the national institute on aging and administered by the university of michigan's institute for social research, if you apply for an interviewer job with them, i hope you like werther's original. figuring out how to analyze this data set might trigger your fight-or-flight synapses if you just start clicking around on michigan's website. instead, read pages numbered 10-17 (pdf pages 12-19) of this introduction pdf and don't touch the data until you understand figure a-3 on that last page. if you start enjoying yourself, here's the whole book. after that, it's time to register for access to the (free) data. keep your username and password handy, you'll need it for the top of the download automation r script. next, look at this data flowchart to get an idea of why the data download page is such a righteous jungle. but wait, good news: umich recently farmed out its data management to the rand corporation, who promptly constructed a giant consolidated file with one record per respondent across the whole panel. oh so beautiful. the rand hrs files make much of the older data and syntax examples obsolete, so when you come across stuff like instructions on how to merge years, you can happily ignore them - rand has done it for you. the health and retirement study only includes noninstitutionalized adults when new respondents get added to the panel (as they were in 1992, 1993, 1998, 2004, and 2010) but once they're in, they're in - respondents have a weight of zero for interview waves when they were nursing home residents; but they're still responding and will continue to contribute to your statistics so long as you're generalizing about a population from a previous wave (for example: it's possible to compute "among all americans who were 50+ years old in 1998, x% lived in nursing homes by 2010"). my source for that 411? page 13 of the design doc. wicked.

    this new github repository contains five scripts:

    1992 - 2010 download HRS microdata.R: loop through every year and every file, download, then unzip everything in one big party

    import longitudinal RAND contributed files.R: create a SQLite database (.db) on the local disk; load the rand, rand-cams, and both rand-family files into the database (.db) in chunks (to prevent overloading ram)

    longitudinal RAND - analysis examples.R: connect to the sql database created by the 'import longitudinal RAND contributed files' program; create two database-backed complex sample survey objects, using a taylor-series linearization design; perform a mountain of analysis examples with wave weights from two different points in the panel

    import example HRS file.R: load a fixed-width file using only the sas importation script directly into ram with SAScii (http://blog.revolutionanalytics.com/2012/07/importing-public-data-with-sas-instructions-into-r.html); parse through the IF block at the bottom of the sas importation script, blank out a number of variables; save the file as an R data file (.rda) for fast loading later

    replicate 2002 regression.R: connect to the sql database created by the 'import longitudinal RAND contributed files' program; create a database-backed complex sample survey object, using a taylor-series linearization design; exactly match the final regression shown in this document provided by analysts at RAND as an update of the regression on pdf page B76 of this document. click here to view these five scripts

    for more detail about the health and retirement study (hrs), visit: michigan's hrs homepage; rand's hrs homepage; the hrs wikipedia page; a running list of publications using hrs

    notes: exemplary work making it this far. as a reward, here's the detailed codebook for the main rand hrs file. note that rand also creates 'flat files' for every survey wave, but really, most every analysis you can think of is possible using just the four files imported with the rand importation script above. if you must work with the non-rand files, there's an example of how to import a single hrs (umich-created) file, but if you wish to import more than one, you'll have to write some for loops yourself. confidential to sas, spss, stata, and sudaan users: a tidal wave is coming. you can get water up your nose and be dragged out to sea, or you can grab a surf board. time to transition to r. :D

  20. data set with SAS code for analyses that are reported in Table 1 of paper

    • figshare.com
    txt
    Updated Jul 16, 2024
    Cite
    Steven Juliano (2024). data set with SAS code for analyses that are reported in Table 1 of paper [Dataset]. http://doi.org/10.6084/m9.figshare.26314546.v1
    Available download formats: txt
    Dataset updated
    Jul 16, 2024
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    Steven Juliano
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This is the data set with SAS code that yielded the analyses in Table 1 of the paper.
