12 datasets found
  1. File Compression Tools Market Report | Global Forecast From 2025 To 2033

    • dataintelo.com
    csv, pdf, pptx
    Updated Jan 7, 2025
    Cite
    Dataintelo (2025). File Compression Tools Market Report | Global Forecast From 2025 To 2033 [Dataset]. https://dataintelo.com/report/file-compression-tools-market
    Explore at:
    Available download formats: pptx, pdf, csv
    Dataset updated
    Jan 7, 2025
    Dataset authored and provided by
    Dataintelo
    License

    https://dataintelo.com/privacy-and-policy

    Time period covered
    2024 - 2032
    Area covered
    Global
    Description

    File Compression Tools Market Outlook



    The global file compression tools market size is projected to reach USD XX billion by 2032 from USD XX billion in 2023, growing at a CAGR of XX% during the forecast period. The steady growth in the market size is driven by the increasing digitization across various industries, which necessitates efficient data management solutions like file compression tools.
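The report's figures are withheld (the XX placeholders), but the arithmetic linking start value, end value, and CAGR over the forecast horizon can be sketched with illustrative numbers:

```python
def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate: the constant yearly rate that
    takes start_value to end_value over the given number of years."""
    return (end_value / start_value) ** (1 / years) - 1

# Illustrative numbers only (the report's actual figures are withheld):
# growing from USD 2.0 billion in 2023 to USD 4.0 billion in 2032 (9 years)
rate = cagr(2.0, 4.0, 9)
print(f"{rate:.2%}")  # roughly 8% per year to double over 9 years
```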



    The growing volume of digital data generated daily is one of the primary growth factors for the file compression tools market. As businesses and individuals increasingly rely on digital platforms for communication and data storage, the need for efficient file compression tools becomes paramount. These tools help in saving storage space and reducing the time and bandwidth required for data transfer, thus enhancing overall productivity. Furthermore, the rise of cloud computing and the need for seamless data transfer between cloud and on-premises environments further boost the demand for advanced file compression solutions.



    Another significant growth driver is the increasing adoption of high-definition content. With the proliferation of 4K and 8K videos, high-resolution images, and other large digital files, there is a growing need for robust file compression tools that can handle large file sizes without significant loss of quality. This trend is particularly prominent in the media and entertainment industry, which requires efficient compression tools to manage and distribute high-quality content swiftly. Additionally, the rising use of big data analytics across various sectors also contributes to the increased demand for file compression tools, as they help in efficiently managing and processing large datasets.



    Technological advancements in compression algorithms are also propelling the growth of the file compression tools market. Modern compression techniques offer superior compression ratios and faster processing speeds, making them more efficient and reliable. The integration of artificial intelligence and machine learning algorithms in compression tools further enhances their performance, enabling more intelligent and adaptive compression strategies. This continuous innovation ensures that file compression tools remain relevant and capable of meeting the evolving needs of users.



    In the context of file compression tools, the role of a Compression Driver is increasingly becoming pivotal. A Compression Driver is essentially a software component or a set of algorithms that manage the compression and decompression processes, ensuring optimal performance and efficiency. These drivers are crucial for maintaining the balance between compression speed and the quality of the compressed files. As data volumes continue to grow, the demand for more sophisticated Compression Drivers that can handle large datasets without compromising on speed or quality is on the rise. This is particularly important for industries that require real-time data processing and transmission, such as telecommunications and finance.



    Regionally, North America dominates the file compression tools market, driven by the presence of major technology companies and high adoption rates of advanced digital solutions. The Asia Pacific region, however, is expected to witness the highest growth rate during the forecast period. The rapid digitization of economies, increasing internet penetration, and the proliferation of smartphones and other digital devices in countries like China and India are significant contributing factors. Europe also represents a substantial market, with a strong focus on data protection and efficient data management solutions.



    Type Analysis



    The file compression tools market is segmented into lossless compression and lossy compression. Lossless compression algorithms enable the complete restoration of the original file without any loss of data. This type of compression is particularly crucial for industries where data integrity is paramount, such as healthcare and BFSI. Lossless compression tools are widely used for compressing text files, databases, and other critical data that must remain unaltered. The increasing emphasis on data security and integrity is driving the demand for lossless compression solutions in these sectors.
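The defining property of lossless compression, bit-exact recovery of the original data, can be illustrated with Python's standard-library zlib; this is a minimal sketch, not any particular vendor's tool:

```python
import zlib

# Repetitive, record-like data compresses well with a lossless codec.
original = b"record 0001: value 5.4\n" * 500
compressed = zlib.compress(original, level=9)

# Lossless: decompression restores the input bit for bit.
assert zlib.decompress(compressed) == original
print(f"{len(original)} bytes -> {len(compressed)} bytes "
      f"(ratio {len(original) / len(compressed):.0f}:1)")
```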



    In contrast, lossy compression algorithms achieve higher compression ratios by discarding some amount of data, which is generally imperceptible to human senses. This type of compression is ideal for media files su

  2. Data from: Launching Vehicle Acoustic Data Compression Study Using Lossy...

    • scielo.figshare.com
    jpeg
    Updated May 31, 2023
    Cite
    Guilherme Coelho da Silva Stanisce Corrêa; Rogério Pirk; Marcelo da Silva Pinho (2023). Launching Vehicle Acoustic Data Compression Study Using Lossy Audio Formats [Dataset]. http://doi.org/10.6084/m9.figshare.14291402.v1
    Explore at:
    Available download formats: jpeg
    Dataset updated
    May 31, 2023
    Dataset provided by
    SciELO journals
    Authors
    Guilherme Coelho da Silva Stanisce Corrêa; Rogério Pirk; Marcelo da Silva Pinho
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    ABSTRACT: The field of data compression has evolved over the last decades, and several techniques have been developed to reduce the amount of acquired sensor data that must be transmitted. These techniques are usually classified as lossless or lossy: lossless techniques recover all acquired data, while lossy techniques introduce errors into the data. Each technique presents advantages and drawbacks, and the analyst is responsible for choosing the appropriate one for a specific application. This work presents a comparative study of lossy audio formats applied to launch vehicle on-board acoustic data. The Opus format achieved a higher compression rate than standard compression techniques, reducing the amount of data to be transmitted through a launch vehicle telemetry link by a factor of up to 254, with the lowest discrepancy from the original data as measured by the mean square error metric.

  3. Data from: Less is more: on-board lossy compression of accelerometer data...

    • zenodo.org
    • data.niaid.nih.gov
    • +1more
    csv
    Updated Jun 2, 2022
    Cite
    Rascha Nuijten; Theo Gerrits; Judy Shamoun-Baranes; Bart Nolet (2022). Less is more: on-board lossy compression of accelerometer data increases biologging capacity [Dataset]. http://doi.org/10.5061/dryad.6djh9w0x9
    Explore at:
    Available download formats: csv
    Dataset updated
    Jun 2, 2022
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Rascha Nuijten; Theo Gerrits; Judy Shamoun-Baranes; Bart Nolet
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description
    1. GPS-tracking devices have been used in combination with a wide range of additional sensors to study animal behaviour, physiology and interactions with the environment. Tri-axial accelerometers allow researchers to remotely infer the behaviour of individuals, at all places and times. Collection of accelerometer data is relatively cheap in terms of energy usage, but the amount of raw data collected generally requires much storage space and is particularly demanding in terms of the energy needed for data transmission.
    2. Here we propose compressing the raw ACC data into summary statistics within the tracking device (before transmission) to reduce data size, as a means to overcome limitations in storage and energy capacity.
    3. We explored this type of lossy data compression in the accelerometer data of tagged Bewick's swans (Cygnus columbianus bewickii) collected in spring 2017. By using software settings in which bouts of 2 s of both raw ACC data and summary statistics were collected in parallel but with different bout intervals to keep total data size comparable, we created the opportunity for a direct comparison of time budgets derived by the two data collection methods.
    4. We found that the data compression in our case yielded a six-fold reduction in data size per bout, and concurrent, similar decreases in the storage and energy use of the device. We show that, with the same accuracy of the behavioural classification, the freed memory and energy of the device can be used to increase the monitoring effort, resulting in a more detailed representation of the individuals' time budget. Rare and/or short behaviours, such as daily roost flights, were picked up significantly more often when collecting summary statistics instead of raw ACC data (but note the differences in sampling rate). Such a level of detail can be of essential importance, for instance to make a reliable estimate of the energy budgets of individuals.
    5. In conclusion, we argue that this type of lossy data compression can be a well-considered choice in study situations where limitations in energy and storage space of the device pose a problem. Ultimately these developments can allow for long-term and nearly continuous remote-monitoring of the behaviour of free-ranging animals.
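The idea of on-board compression into summary statistics can be sketched as follows; the choice of statistics and the 20 Hz sampling rate are illustrative assumptions, not the study's exact settings:

```python
import statistics

def summarize_bout(ax, ay, az):
    """Reduce a raw tri-axial ACC bout to a few summary statistics
    (per-axis mean and standard deviation), which are transmitted
    instead of the raw samples."""
    return [f(axis) for axis in (ax, ay, az)
            for f in (statistics.fmean, statistics.pstdev)]

# A 2-s bout at an assumed 20 Hz: 40 samples per axis, 120 values total.
n = 40
ax = [0.1 * (i % 5) for i in range(n)]        # synthetic surge axis
ay = [9.8 + 0.05 * (i % 3) for i in range(n)]  # synthetic heave axis (gravity)
az = [0.2] * n                                 # synthetic sway axis

summary = summarize_bout(ax, ay, az)
print(f"raw values: {3 * n}, transmitted values: {len(summary)} "
      f"({3 * n // len(summary)}x reduction)")
```

The actual reduction factor depends on how many statistics are kept per bout; with six values replacing 120 raw samples, the sketch gives a 20x reduction, in the same spirit as the six-fold reduction reported above.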
  4. X-ray scan of soft ball compression experiments - raw data - 3D DIC data -...

    • entrepot.recherche.data.gouv.fr
    7z, bin, txt
    Updated Sep 1, 2023
    Cite
    Jonathan Barés; Manuel Cárdenas-Barrantes; Gustavo Pinzon; Edward Andò; Mathieu Renouf; Gioacchino Viggiani; Émilien Azéma (2023). X-ray scan of soft ball compression experiments - raw data - 3D DIC data - post-processed data [Dataset]. http://doi.org/10.57745/MXEMI4
    Explore at:
    Available download formats: bin(47185920000), bin(12828153317), txt(5539), 7z(25756361299), 7z(22518449369), bin(38257584183)
    Dataset updated
    Sep 1, 2023
    Dataset provided by
    Recherche Data Gouv
    Authors
    Jonathan Barés; Manuel Cárdenas-Barrantes; Gustavo Pinzon; Edward Andò; Mathieu Renouf; Gioacchino Viggiani; Émilien Azéma
    License

    https://spdx.org/licenses/etalab-2.0.html

    Description

    These data are associated with the results presented in the paper: Compacting an assembly of soft balls far beyond the jammed state: insights from 3D imaging. They are scans of compressed millimetric silicone balls. Micro glass beads are trapped in the silicone so that 3D DIC can be performed. Post-processed data, including displacement fields, strain fields and contacts, among others, are also available. More details about the experimental protocol and data post-processing can be found in this publication.

    How data are sorted

    The raw and post-processed data of 4 experiments are available here. For each experiment:
    - a 'scan' folder includes 'scan_XX' folders, where 'XX' corresponds to the N compression steps. Inside each of these folders are 8-bit png pictures corresponding to the vertical slices of the density matrix of a given compression step. Slices in which no particles are visible have been removed to save space.
    - a 'result' folder contains all the data post-processed from the density images. More specifically:
      - 'pressure_kpa.txt' is an N vector giving the evolution of the applied pressure (in kPa) on the loading piston, where N is the number of loading steps.
      - 'particle_number.txt' is an n vector telling to which particle a correlation cell belongs, where n is the number of correlation cells.
      - 'particle_size.txt' is an m vector, where m is the number of particles in the system. It gives the particle size: 1 for large particles, 0 for small ones. Particle numbering corresponds with 'particle_number.txt'.
      - The following text files are N x n matrices, where N is the number of steps and n is the number of correlation cells. They give, for each correlation cell, the evolution of an observable measured in the corresponding volume of the cell:
        - 'position_i.txt' is the position of the cell along the i axis
        - 'position_j.txt' is the position of the cell along the j axis
        - 'position_k.txt' is the position of the cell along the k axis
        - 'correlation.txt' is the evolution of the correlation value when performing the 3D DIC. This constitutes the goodness of measurement of the correlation cell positions
        - 'dgt_Fij.txt' is the evolution of the deformation gradient tensor for each of its ij components
        - 'energy.txt' is the evolution of the energy density stored in the material
        - 'no_outlier_energy.txt' is a boolean giving, from the energy density measurement, whether the observables can be considered an outlier (0) or not (1)
      - The following text files are m x N matrices with self-explanatory contents, where N is the number of loading steps and m is the number of grains (particle numbering corresponds with 'particle_number.txt'). They give, for each grain, the evolution of an observable measured at the grain scale. The major direction is the direction in which the particle is longest; the minor direction is the direction in which it is shortest. Theta and phi are the azimuthal and elevation angles, respectively: 'particle_asphericity.txt', 'particle_area.txt', 'minor_direction_theta.txt', 'minor_direction_phi.txt', 'minor_direction_length.txt', 'major_direction_theta.txt', 'major_direction_phi.txt', 'major_direction_length.txt'
      - The following text files are N vectors with self-explanatory contents, where N is the number of loading steps. They give the evolution of a system observable during loading. If a second vector is given, it is the evolution of the standard deviation of the observable. For contacts, 'proximity' is for contacts obtained only from the proximity criterion and 'density' is for contacts obtained from the scanner density criterion; 'std' stands for standard deviation: 'global strain.txt' (measured from the evolution of the system boundaries), 'packing_fraction.txt' (measured from the evolution of the system boundaries and particle volumes), 'average_contact_surface_proximity.txt', 'average_contact_surface_density.txt', 'average_contact_radius_proximity.txt', 'average_contact_densitt_proximity.txt', 'average_contact_outofplane_proximity.txt', 'average_contact_direction_proximity.txt', 'average_contact_direction_density.txt', 'average_contact_asphericity_proximity.txt', 'average_contact_asphericity_density.txt', 'average_vonMises_strain.txt', 'std_vonMises_strain.txt', 'average contact direction_density.txt', 'average_energy.txt', 'std_energy.txt', 'contact_proximity.txt' (number of contacts), 'contact_density.txt' (number of contacts)
    - a 'contact_density' folder includes 'XX' folders corresponding to the N compression steps. Each of these 'XX' folders includes 'ijkP_AA_BB.txt' files giving information about potential contact points between grains AA and BB. For each potential contact, 'ijkP_AA_BB.txt' gives the i, j and k positions of the potential contact points in AA and the associated average local density value, which gives the probability of contact.
    - a 'contact_proximity' folder includes 'XX' folders corresponding to the N compression steps. Each of these 'XX'...
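The N x n text matrices described above (one row per loading step, one column per correlation cell) can be read with a short helper; the whitespace-separated layout is an assumption based on the description, not verified against the files:

```python
def load_matrix(text: str) -> list[list[float]]:
    """Parse an N x n whitespace-separated text matrix, e.g. the
    contents of 'position_i.txt' or 'energy.txt': one line per
    loading step, one column per correlation cell."""
    rows = [[float(v) for v in line.split()]
            for line in text.splitlines() if line.strip()]
    # Sanity check: every loading step must report the same number of cells.
    assert len({len(r) for r in rows}) <= 1, "ragged matrix"
    return rows

# Tiny illustrative matrix: 2 loading steps x 3 correlation cells.
m = load_matrix("1.0 2.0 3.0\n4.0 5.0 6.0\n")
print(len(m), len(m[0]))  # number of steps, number of cells
```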

  5. UbxLogger: U-blox ZED-F9P GNSS logging scripts for OpenWrt routers and...

    • data.4tu.nl
    zip
    Updated Oct 18, 2024
    Cite
    Hans van der Marel (2024). UbxLogger: U-blox ZED-F9P GNSS logging scripts for OpenWrt routers and Single Board Computers. [Dataset]. http://doi.org/10.4121/889fb86b-8b32-4b93-9689-f04b3d3c2571.v1
    Explore at:
    Available download formats: zip
    Dataset updated
    Oct 18, 2024
    Dataset provided by
    4TU.ResearchData
    Authors
    Hans van der Marel
    License

    https://www.apache.org/licenses/LICENSE-2.0.html

    Description

    UbxLogger is a suite of shell scripts and executables for logging data from a U-blox ZED-F9P low-cost GNSS receiver on OpenWrt routers and Single Board Computers such as the Raspberry Pi. Some of the things you can do with UbxLogger are:


    • Log data from one or more U-blox ZED-F9P receivers to a micro SD card, USB stick and/or disk partition
    • Compress the data and save to an archive directory
    • Optionally push the compressed data to a remote server over the Internet (requires LAN, WAN or 4G connectivity)
    • Optionally convert the data to RINEX version 3 files, at a selectable sample rate and interval, compress using Hatanaka compression and gzip, archive and/or push to a remote server.
    • Start on (re)boot, monitoring and restart


    You have the choice to create compressed RINEX files on OpenWrt (or an SBC) and push the RINEX files to the remote server, and/or to push ubx raw-data files to the remote server and convert them to RINEX there. Transferring the compressed RINEX files, especially at a lower sample rate, requires only a fraction of the bandwidth needed for ubx.


    UbxLogger is designed to run on power-efficient OpenWrt routers and Single Board Computers and is written entirely in shell script with a few pre-compiled C executables. It is known to work with


    • The GL-iNet X750V2 (Spitz) OpenWrt 4G router
    • Raspberry Pi single board computer and Teltonika RUT240 4G router


    The total power consumption on the GL-iNet Spitz is below 3W, making this an ideal platform for solar powered operation.

  6. Data from: An evaluation of compression algorithms applied to moving object...

    • data.niaid.nih.gov
    • zenodo.org
    Updated Jan 24, 2020
    Cite
    Baldo, Fabiano (2020). An evaluation of compression algorithms applied to moving object trajectories [Dataset]. https://data.niaid.nih.gov/resources?id=zenodo_3467011
    Explore at:
    Dataset updated
    Jan 24, 2020
    Dataset provided by
    Yoran, Leichsenring
    Baldo, Fabiano
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This file contains the dataset, source code and results presented in the paper entitled "An evaluation of compression algorithms applied to moving object trajectories" published in the International Journal of Geographical Information Science in 2019.

    Abstract: The amount of spatiotemporal data collected by gadgets is rapidly growing, resulting in increasing costs to transfer, process and store it. In an attempt to minimize these costs, several algorithms have been proposed to reduce trajectory size. However, choosing the right algorithm depends on a careful analysis of the application scenario. Therefore, this paper evaluates seven general-purpose lossy compression algorithms in terms of structural aspects and performance characteristics, across four transportation modes: Bike, Bus, Car and Walk. The lossy compression algorithms evaluated are: Douglas-Peucker (DP), Opening-Window (OW), Dead-Reckoning (DR), Top-Down Time-Ratio (TS), Opening-Window Time-Ratio (OS), STTrace (ST) and SQUISH (SQ). Pareto Efficiency analysis pointed out that there is no best algorithm for all assessed characteristics: DP introduced the least error and best preserved length, OW best preserved speed, ST best preserved acceleration, and DR required the least execution time. Another important finding is that algorithms whose metrics do not keep time information performed quite well even for time-dependent characteristics such as speed and acceleration. Finally, DR showed the most suitable performance in general, being among the three best algorithms in four of the five assessed performance characteristics.
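Of the algorithms evaluated, Douglas-Peucker is the most widely known; a compact recursive sketch (2-D perpendicular-distance variant, ignoring the time dimension, so illustrative rather than the paper's exact implementation) looks like this:

```python
def douglas_peucker(points, epsilon):
    """Douglas-Peucker (DP) trajectory simplification: keep an interior
    point only if it deviates from the chord between the kept endpoints
    by more than epsilon; otherwise drop it."""
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]
    dx, dy = x2 - x1, y2 - y1
    norm = (dx * dx + dy * dy) ** 0.5 or 1.0  # guard duplicate endpoints
    # Perpendicular distance of each interior point to the chord.
    dists = [abs(dy * (x - x1) - dx * (y - y1)) / norm for x, y in points[1:-1]]
    i = max(range(len(dists)), key=dists.__getitem__) + 1
    if dists[i - 1] <= epsilon:
        return [points[0], points[-1]]
    # Recurse on both halves around the farthest point.
    left = douglas_peucker(points[:i + 1], epsilon)
    right = douglas_peucker(points[i:], epsilon)
    return left[:-1] + right

track = [(0, 0), (1, 0.1), (2, -0.1), (3, 5), (4, 6), (5, 7), (6, 8.1), (7, 9)]
print(douglas_peucker(track, 1.0))
```

Because DP compares points only against a spatial chord, it discards timestamps entirely, which is exactly why variants such as Top-Down Time-Ratio exist for time-aware error metrics.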

  7. Global Data Compression Software Market Risk Analysis 2025-2032

    • statsndata.org
    excel, pdf
    Updated May 2025
    Cite
    Stats N Data (2025). Global Data Compression Software Market Risk Analysis 2025-2032 [Dataset]. https://www.statsndata.org/report/data-compression-software-market-49490
    Explore at:
    Available download formats: pdf, excel
    Dataset updated
    May 2025
    Dataset authored and provided by
    Stats N Data
    License

    https://www.statsndata.org/how-to-order

    Area covered
    Global
    Description

    The Data Compression Software market has become a fundamental component of the digital landscape, enabling businesses and individuals to manage and optimize their data more effectively. At its core, data compression software reduces the size of files and datasets, which not only saves storage space but also enhances

  8. Global Compression Bags Market Global Trade Dynamics 2025-2032

    • statsndata.org
    excel, pdf
    Updated May 2025
    Cite
    Stats N Data (2025). Global Compression Bags Market Global Trade Dynamics 2025-2032 [Dataset]. https://www.statsndata.org/report/compression-bags-market-245637
    Explore at:
    Available download formats: excel, pdf
    Dataset updated
    May 2025
    Dataset authored and provided by
    Stats N Data
    License

    https://www.statsndata.org/how-to-order

    Area covered
    Global
    Description

    The Compression Bags market has emerged as a significant segment in the packaging industry, providing practical solutions for both consumers and businesses looking to save space and enhance storage efficiency. Compression bags have gained popularity due to their ability to reduce the volume of clothing, bedding, and

  9. NFIMM (NIST Fingerprint Image Metadata Modifier) software written in C++...

    • datasets.ai
    • data.nist.gov
    47, 57
    Updated Feb 26, 2024
    + more versions
    Cite
    National Institute of Standards and Technology (2024). NFIMM (NIST Fingerprint Image Metadata Modifier) software written in C++ which updates image (file) metadata information for BMP and PNG compression formats leaving original image file unchanged [Dataset]. https://datasets.ai/datasets/nfimm-nist-fingerprint-image-metadata-modifier-software-written-in-c-which-updates-image-f
    Explore at:
    Available download formats: 57, 47
    Dataset updated
    Feb 26, 2024
    Dataset authored and provided by
    National Institute of Standards and Technology (http://www.nist.gov/)
    Description

    NFIMM is a software library written in C++ that runs on all major computer platforms, for example Linux. It updates the header metadata for BMP and PNG image files while leaving the image data unchanged. The image and the metadata to be updated are supplied by the NFIMM using-software:
    - Required: source image file path or bytes-stream; source image compression format; destination image sample rate; destination image sample rate units
    - Optional: source image sample rate and units; PNG custom text: Description, Author, Creation Time, etc.
    Note that the destination image compression format is constrained to be the same as the source image's and therefore is not a required input parameter. NFIMM inputs the source image from the using-software either as a path to the file or as the contents of a buffer already in memory. It makes the modified image available to the user in a new buffer or provides a method for the user to save the modified image to disk. NFIMM may be used stand-alone to update an image file when the resolution information is known to be incorrect. NFIMM was developed specifically to support the NFIR (NIST Fingerprint Image Resampler) executable. NFIR requires the target image resolution value to perform the resampling; this resolution value is used to update the image metadata.

  10. Intermittent Pneumatic Compression Pump Report

    • datainsightsmarket.com
    doc, pdf, ppt
    Updated May 19, 2025
    Cite
    Data Insights Market (2025). Intermittent Pneumatic Compression Pump Report [Dataset]. https://www.datainsightsmarket.com/reports/intermittent-pneumatic-compression-pump-991360
    Explore at:
    Available download formats: ppt, pdf, doc
    Dataset updated
    May 19, 2025
    Dataset authored and provided by
    Data Insights Market
    License

    https://www.datainsightsmarket.com/privacy-policy

    Time period covered
    2025 - 2033
    Area covered
    Global
    Variables measured
    Market Size
    Description

    The global Intermittent Pneumatic Compression (IPC) Pump market is experiencing robust growth, driven by the increasing prevalence of venous insufficiency, edema, and post-surgical rehabilitation needs. The market, currently valued in the billions (a precise figure requires the missing market size data, but considering similar medical device markets, a reasonable estimate would be in the range of $2-3 billion in 2025), is projected to witness a significant Compound Annual Growth Rate (CAGR) over the forecast period (2025-2033). Key growth drivers include the aging global population, rising healthcare expenditure, technological advancements leading to more portable and user-friendly devices, and an increased awareness among healthcare professionals and patients about the benefits of IPC therapy for preventing deep vein thrombosis (DVT) and improving lymphatic drainage. The market is segmented by application (hospitals and clinics, rehabilitation centers, household use) and type (desktop and vertical IPC pumps), with hospitals and clinics currently dominating the application segment due to higher adoption rates. The desktop segment holds a larger market share compared to the vertical segment, primarily due to its cost-effectiveness and ease of use. However, the vertical segment is expected to witness faster growth driven by increasing demand for compact and space-saving devices, particularly in home healthcare settings. Geographic segmentation reveals strong market presence in North America and Europe, propelled by well-established healthcare infrastructure and higher disposable incomes. However, emerging economies in Asia-Pacific are showing substantial growth potential, fueled by rising healthcare investments and increasing awareness of IPC therapy. While the market enjoys significant growth prospects, certain restraints exist. 
    These include the high initial cost of IPC pumps, particularly for advanced models, which can limit accessibility in low-income regions. Furthermore, the potential for adverse effects, although rare, needs to be addressed through robust patient education and proper device-usage guidelines. Competitive intensity is another factor to consider, with several established players and emerging companies vying for market share. The long-term growth trajectory nevertheless remains positive, suggesting a promising future for manufacturers who can successfully navigate these challenges by focusing on innovation, cost-effectiveness, and targeted market expansion. Successful players will likely focus on integrating smart technologies, enhancing user-friendliness, and establishing strong distribution networks to cater to the diverse needs of healthcare providers and individual patients.

  11. Data from: Improving Network Efficiency with Simplemux

    • zenodo.org
    • data.niaid.nih.gov
    zip
    Updated Jan 24, 2020
    Cite
    Jose Saldana; Ignacio Forcén; Julián Fernández-Navajas; José Ruiz-Mas (2020). Improving Network Efficiency with Simplemux [Dataset]. http://doi.org/10.5281/zenodo.35246
    Explore at:
    Available download formats: zip
    Dataset updated
    Jan 24, 2020
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Jose Saldana; Ignacio Forcén; Julián Fernández-Navajas; José Ruiz-Mas
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This dataset contains the open data related to the research paper:

    Jose Saldana, Ignacio Forcen, Julian Fernandez-Navajas, Jose Ruiz-Mas, "Improving Network Efficiency with Simplemux,'' IEEE CIT 2015, International Conference on Computer and Information Technology, 26-28 October 2015 in Liverpool, UK.

    This work has been partially financed by the EU H2020 Wi-5 project (Grant Agreement no: 644262), and European Social Fund in collaboration with the Government of Aragon.

    Paper Abstract—The high number of small packets currently transported by IP networks results in high overhead, caused by the significant header-to-payload ratio of these packets. In addition, the MAC layer of wireless technologies makes non-optimal use of airtime when packets are small. Small packets are also costly in terms of processing capacity. This paper presents Simplemux, a protocol able to multiplex a number of packets sharing a common network path, thus increasing efficiency when small packets are transported. It can be useful in constrained scenarios where resources are scarce, such as community wireless networks or IoT. Simplemux can be seen as an alternative to Layer-2 optimization, already available in 802.11 networks. The design of Simplemux is presented, and its efficiency improvement is analyzed. An implementation is used to carry out tests with real traffic, showing significant improvements: 46% of the bandwidth can be saved when compressing voice traffic, and the reduction in packets per second in an Internet trace can be up to 50%. In wireless networks, packet grouping results in significantly improved use of airtime.
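The header-to-payload overhead argument can be made concrete with a back-of-the-envelope calculation; the header sizes, the 3-byte per-packet separator, and the multiplexing factor below are illustrative assumptions, not Simplemux's exact figures:

```python
def efficiency(payload_bytes: int, header_bytes: int) -> float:
    """Fraction of bytes on the wire that are useful payload."""
    return payload_bytes / (payload_bytes + header_bytes)

PAYLOAD = 40      # a small VoIP-like payload, bytes (assumed)
IP_UDP = 20 + 8   # IPv4 + UDP headers per packet
SEP = 3           # assumed per-packet multiplexing separator, bytes
N = 10            # packets multiplexed into one outer packet (assumed)

# Native: every small packet carries its own IP/UDP headers.
native = efficiency(N * PAYLOAD, N * IP_UDP)
# Multiplexed: one outer IP/UDP header plus a short separator per packet.
muxed = efficiency(N * PAYLOAD, IP_UDP + N * SEP)
print(f"native: {native:.1%}, multiplexed: {muxed:.1%}")
```

With these assumed numbers, wire efficiency rises from roughly 59% to roughly 87%, and the packet-per-second count at routers drops by the multiplexing factor, which is the same qualitative effect the abstract reports.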

  12. High Temperature Compression Studies of a Zr-2.5Nb Alloy using Deformation...

    • zenodo.org
    • data.niaid.nih.gov
    zip
    Updated Feb 15, 2022
    Christopher Stuart Daniel; Christian J. Peyton; João Quinta da Fonseca (2022). High Temperature Compression Studies of a Zr-2.5Nb Alloy using Deformation Dilatometer [Dataset]. http://doi.org/10.5281/zenodo.3374512
    Explore at:
    zip (available download formats)
    Dataset updated
    Feb 15, 2022
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Christopher Stuart Daniel; Christian J. Peyton; João Quinta da Fonseca
    License

    Attribution 4.0 (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Data recorded in uniaxial compression for a Zr-2.5Nb alloy deformed at temperatures of 650, 675, 700, 725, 750, 775, 800, 825 and 850 °C, at strain rates of 10^-2.5, 10^-2, 10^-1.5, 10^-1, 10^-0.5 and 1 s^-1, to 50% height reduction, using a TA Instruments DIL 805 A/D/T Quenching and Deformation Dilatometer. The cylindrical samples measured 5 mm in diameter and 10 mm in height. The Zr-2.5Nb specimens were machined from the centre of an as-received forged plate manufactured at Wah Chang, with a beta-transformed starting microstructure. Si3N4 platens were used for all tests, with graphite lubricant applied at the ends of each sample to minimise friction. Tests were conducted in an inert He gas atmosphere. Temperature was controlled using an S-type thermocouple spot-welded to the centre of each sample.

    Data recorded at a high acquisition frequency during deformation is stored in the 'deformation_files' folder and saved with the format: 'test number (001 to 191)_temperature_log(strain rate)_repeat number (01 or 02)'. Data in the 'basic_files' folder is recorded at a lower acquisition frequency, but covers the entire thermomechanical cycle, including both heating and cooling stages as well as deformation. The 'software_files' folder includes metadata stored in the form of a parameter file (.par and .pad), along with a Windows data file (.D5D) that can be loaded and analysed within the dilatometer user interface.

    An accompanying Python script allows the user to plot the stress-strain data in the Jupyter Notebook application and to generate 'processing maps' of the material. A critical assessment of the application of 'processing maps' is included in the accompanying paper:

    C. S. Daniel, P. Jedrasiak, C. J. Peyton, J. Quinta da Fonseca, H. R. Shercliff, L. Bradley, and P. D. Honniball, "Quantifying Processing Map Uncertainties by Modeling the Hot-Compression Behavior of a Zr-2.5Nb Alloy," in Zirconium in the Nuclear Industry: 19th International Symposium, ed. A. T. Motta and S. K. Yagnik (West Conshohocken, PA: ASTM International, 2021), 93–122. doi:10.1520/STP162220190031
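    For readers without the dilatometer software, a minimal sketch of the usual conversion from measured force and height change to true stress-strain is shown below. It assumes constant volume and uniform (barrelling-free) deformation; the function name, argument conventions and units are assumptions for illustration, and the dataset's own Jupyter script is the authoritative analysis.

    ```python
    import numpy as np

    def true_stress_strain(force_N, delta_h_mm, d0_mm=5.0, h0_mm=10.0):
        """Convert uniaxial compression data to true stress and true strain.

        Assumes constant volume and uniform deformation (a standard
        approximation; the dataset's own script may apply corrections).
        delta_h_mm is the change in sample height, negative in compression.
        """
        h = h0_mm + delta_h_mm                 # current sample height, mm
        eps_true = -np.log(h / h0_mm)          # true compressive strain (positive)
        area0 = np.pi * (d0_mm / 2.0) ** 2     # initial cross-section, mm^2
        area = area0 * h0_mm / h               # constant-volume area update
        sigma_true = force_N / area            # true stress in MPa (N / mm^2)
        return eps_true, sigma_true
    ```

    At the 50% height reduction used in these tests (delta_h_mm = -5.0), this gives a final true strain of ln 2 ≈ 0.69.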

  13. Not seeing a result you expected?
    Learn how you can add new datasets to our index.

File Compression Tools Market Outlook






Technological advancements in compression algorithms are also propelling the growth of the file compression tools market. Modern compression techniques offer superior compression ratios and faster processing speeds, making them more efficient and reliable. The integration of artificial intelligence and machine learning algorithms in compression tools further enhances their performance, enabling more intelligent and adaptive compression strategies. This continuous innovation ensures that file compression tools remain relevant and capable of meeting the evolving needs of users.



In the context of file compression tools, the role of a Compression Driver is increasingly becoming pivotal. A Compression Driver is essentially a software component or a set of algorithms that manage the compression and decompression processes, ensuring optimal performance and efficiency. These drivers are crucial for maintaining the balance between compression speed and the quality of the compressed files. As data volumes continue to grow, the demand for more sophisticated Compression Drivers that can handle large datasets without compromising on speed or quality is on the rise. This is particularly important for industries that require real-time data processing and transmission, such as telecommunications and finance.



Regionally, North America dominates the file compression tools market, driven by the presence of major technology companies and high adoption rates of advanced digital solutions. The Asia Pacific region, however, is expected to witness the highest growth rate during the forecast period. The rapid digitization of economies, increasing internet penetration, and the proliferation of smartphones and other digital devices in countries like China and India are significant contributing factors. Europe also represents a substantial market, with a strong focus on data protection and efficient data management solutions.



Type Analysis



The file compression tools market is segmented into lossless compression and lossy compression. Lossless compression algorithms enable the complete restoration of the original file without any loss of data. This type of compression is particularly crucial for industries where data integrity is paramount, such as healthcare and BFSI. Lossless compression tools are widely used for compressing text files, databases, and other critical data that must remain unaltered. The increasing emphasis on data security and integrity is driving the demand for lossless compression solutions in these sectors.
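    The defining property of lossless compression, bit-for-bit recovery of the original data, can be demonstrated in a few lines with Python's standard zlib module (the input here is deliberately redundant toy data, chosen so it compresses well):

    ```python
    import zlib

    original = b"the quick brown fox " * 500   # 10,000 bytes of redundant toy data
    compressed = zlib.compress(original, level=9)
    restored = zlib.decompress(compressed)

    assert restored == original                # lossless: nothing is altered
    ratio = len(original) / len(compressed)    # compression ratio on this input
    print(len(original), len(compressed))
    ```

    Real-world ratios depend entirely on the redundancy of the input; text and databases typically compress well, while already-compressed media barely shrinks at all.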



In contrast, lossy compression algorithms achieve higher compression ratios by discarding some amount of data, which is generally imperceptible to human senses. This type of compression is ideal for media files such as images, audio, and video, where a small loss of quality is usually an acceptable trade-off for a much smaller file size.
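    The lossy trade-off can be illustrated with a crude quantisation step applied before a lossless coder; this is only a toy stand-in for what real media codecs do far more carefully:

    ```python
    import zlib

    samples = bytes(range(256)) * 4           # toy "media" signal with 256 levels
    # Crude lossy step: keep only 16 levels. Real codecs discard detail guided
    # by perceptual models; here the quantisation is purely illustrative.
    quantised = bytes((s // 16) * 16 for s in samples)

    lossless_size = len(zlib.compress(samples, 9))
    lossy_size = len(zlib.compress(quantised, 9))
    # The quantised signal is far more redundant, so it compresses much better,
    # but the discarded detail can never be recovered from the compressed file.
    ```

    This is why lossy formats dominate media distribution while lossless tools remain mandatory wherever every bit matters.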
