License: CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/)
License information was derived automatically
This dataset provides a granular, timestamped record of order book events—including order placements, modifications, cancellations, and trades—across multiple equity markets and exchanges. With microsecond-level precision and comprehensive event attributes, it is ideal for quantitative research, backtesting high-frequency trading strategies, and analyzing market microstructure dynamics.
Legal disclaimer: https://www.kappasignal.com/p/legal-disclaimer.html
This dataset compiles financial data with a broad range of statistical features, providing a foundation for research and modeling in finance.
Historical daily stock prices (open, high, low, close, volume)
Fundamental data (e.g., market capitalization, price-to-earnings (P/E) ratio, dividend yield, earnings per share (EPS), price-to-earnings growth (PEG) ratio, debt-to-equity ratio, price-to-book ratio, current ratio, free cash flow, projected earnings growth, return on equity, dividend payout ratio, price-to-sales ratio, credit rating)
Technical indicators (e.g., moving averages, RSI, MACD, average directional index, Aroon oscillator, stochastic oscillator, on-balance volume, accumulation/distribution (A/D) line, parabolic SAR, Bollinger Bands, Fibonacci, Williams %R, commodity channel index)
Feature engineering based on financial data and technical indicators (see the pandas sketch after this list)
Sentiment analysis data from social media and news articles
Macroeconomic data (e.g., GDP, unemployment rate, interest rates, consumer spending, building permits, consumer confidence, inflation, producer price index, money supply, home sales, retail sales, bond yields)
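To illustrate the feature-engineering item above, here is a minimal pandas sketch (not part of the dataset) that derives two of the listed indicators, a 20-day simple moving average and a 14-day RSI, from daily prices; the column name "close" is an assumption about the file layout.

import pandas as pd

def add_basic_indicators(df: pd.DataFrame, rsi_window: int = 14) -> pd.DataFrame:
    """Derive a 20-day SMA and a simple-moving-average RSI from a 'close' column (assumed name)."""
    out = df.copy()
    out["sma_20"] = out["close"].rolling(20).mean()
    delta = out["close"].diff()
    avg_gain = delta.clip(lower=0).rolling(rsi_window).mean()    # average gain over the window
    avg_loss = (-delta.clip(upper=0)).rolling(rsi_window).mean()  # average loss over the window
    out[f"rsi_{rsi_window}"] = 100 - 100 / (1 + avg_gain / avg_loss)
    return out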
Stock price prediction
Portfolio optimization
Algorithmic trading
Market sentiment analysis
Risk management
Researchers investigating the effectiveness of machine learning in stock market prediction
Analysts developing quantitative trading Buy/Sell strategies
Individuals interested in building their own stock market prediction models
Students learning about machine learning and financial applications
The dataset may include different levels of granularity (e.g., daily, hourly)
Data cleaning and preprocessing are essential before model training
Regular updates are recommended to maintain the accuracy and relevance of the data
License: Attribution 4.0 (CC BY 4.0) (https://creativecommons.org/licenses/by/4.0/)
License information was derived automatically
The Vietnamese government introduced a change in the minimum tick size for stock trading on 12 September 2016 to improve market quality and reduce trade execution costs. The intended effects of this policy have not been widely investigated in an emerging market such as Vietnam. We use intraday trade and quote data for all stocks listed on the Ho Chi Minh Stock Exchange for the periods before and after the event, excluding the week of 12/9/2016 to 18/9/2016 to allow the market to adapt to the new tick size policy. The findings confirm that trading costs are reduced following the change to the smallest tick size. However, the effect differs for large trades executed at price levels associated with a larger tick size. The findings are also robust to a different sample period. They imply that the 2016 tick size change in Vietnam is desirable for improving market quality, but that differentiating tick sizes across price ranges is not necessarily effective for improving market quality and reducing trade execution costs.
The Australian Securities Exchange (ASX) was established in July 2006 after the Australian Stock Exchange merged with the Sydney Futures Exchange, making it one of the top 20 global exchange groups by market capitalization. ASX facilitates trading in leading stocks, ETFs, derivatives, fixed income, commodities, and energy, commanding over 80% of the market share in the Australian Cash Market, with the S&P/ASX 200 as its main index. We offer comprehensive real-time market information services for all instruments in the ASX Level 1 and Level 2 (full market depth) products, and also provide Level 1 data as a delayed service. You can access this data through various means tailored to your specific needs and workflows, whether for trading via electronic low latency datafeeds, using our desktop services equipped with advanced analytical tools, or through our end-of-day valuation and risk management products.
Stock market enthusiasts can use this dataset to build trading strategies. It contains tick-by-tick prices for the top 10 Indian stocks by market capitalization. The main application is to find relationships between the various stock prices. The data has 3 columns and more than 10,000 rows for each stock, totaling 30 columns and 10,000+ rows.
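As a quick illustration of the stated use case (relating the stock prices to one another), a minimal pandas sketch; the file name and column names (timestamp, symbol, price) are assumptions, not documented fields of this dataset.

import pandas as pd

ticks = pd.read_csv("ticks.csv", parse_dates=["timestamp"])   # hypothetical file and column names
prices = (
    ticks.pivot_table(index="timestamp", columns="symbol", values="price")
         .resample("1min").last()   # align irregular ticks on a common 1-minute grid
         .ffill()
)
returns = prices.pct_change().dropna()
print(returns.corr())               # pairwise correlation of 1-minute returns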
License: CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/)
Real-time, up-to-date cryptocurrency market data can be quite expensive and hard to get. However, historical financial data are the starting point for developing algorithms to analyze market trends - and why not try to beat the market by predicting its movements?
Data provided in this dataset are historical data from the beginning of the SAND-GBP pair market on the Kraken exchange up to the present (December 2021). The data come from real trades on one of the most popular cryptocurrency exchanges.
Historical market data, also known as trading history, time and sales, or tick data, provides a detailed record of every trade that happens on the Kraken exchange and includes the following information:
- Timestamp - the exact date and time of each trade
- Price - the price at which each trade occurred
- Volume - the amount of volume that was traded
In addition, OHLCVT data are provided for the most common period intervals: 1 min, 5 min, 15 min, 1 hour, 12 hours and 1 day. OHLCVT stands for Open, High, Low, Close, Volume and Trades and represents the following trading information for each time period:
- Open - the first traded price
- High - the highest traded price
- Low - the lowest traded price
- Close - the final traded price
- Volume - the total volume traded by all trades
- Trades - the number of individual trades
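If you need an interval that is not shipped, the OHLCVT bars can be rebuilt from the raw trades; here is a minimal pandas sketch, assuming a headerless tick file with unix-second timestamps, price, and volume (the file name and layout are assumptions):

import pandas as pd

# Assumed layout: no header row, columns = unix-second timestamp, price, volume.
trades = pd.read_csv("SANDGBP_trades.csv", names=["timestamp", "price", "volume"])
trades["timestamp"] = pd.to_datetime(trades["timestamp"], unit="s")
trades = trades.set_index("timestamp")

ohlcvt = trades["price"].resample("4h").ohlc()             # Open, High, Low, Close per 4-hour bar
ohlcvt["volume"] = trades["volume"].resample("4h").sum()   # total volume traded
ohlcvt["trades"] = trades["price"].resample("4h").count()  # number of individual trades
print(ohlcvt.head())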
Don't hesitate to tell me if you need other period intervals 😉 ...
This dataset will be updated every quarter to add new, up-to-date market data. Let me know if you need more frequent updates.
Can you beat the market? Let's see what you can do with these data!
Extensive and dependable pricing information spanning the entire range of financial markets. Encompassing worldwide coverage from stock exchanges, trading platforms, indicative contributed prices, assessed valuations, expert third-party sources, and our enhanced data offerings. User-friendly request-response, bulk access, and tailored desktop interfaces to meet nearly any organizational or application data need. Worldwide, real-time, delayed streaming, intraday updates, and meticulously curated end-of-day pricing information.
License: Attribution 4.0 (CC BY 4.0) (https://creativecommons.org/licenses/by/4.0/)
License information was derived automatically
Gazprom stock price, live market quote, shares value, historical data, intraday chart, earnings per share and news.
Website disclaimer: https://www.lseg.com/en/policies/website-disclaimer
Browse LSEG's Shanghai Stock Exchange (SSE) data and view multiple asset classes, including equities, bonds, indices, funds, and stock options.
License: CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/)
Real-time, up-to-date cryptocurrency market data can be quite expensive and hard to get. However, historical financial data are the starting point for developing algorithms to analyze market trends - and why not try to beat the market by predicting its movements?
Data provided in this dataset are historical data from the beginning of the ETC-ETH pair market on the Kraken exchange up to the present (December 2021). The data come from real trades on one of the most popular cryptocurrency exchanges.
Historical market data, also known as trading history, time and sales, or tick data, provides a detailed record of every trade that happens on the Kraken exchange and includes the following information:
- Timestamp - the exact date and time of each trade
- Price - the price at which each trade occurred
- Volume - the amount of volume that was traded
In addition, OHLCVT data are provided for the most common period intervals: 1 min, 5 min, 15 min, 1 hour, 12 hours and 1 day. OHLCVT stands for Open, High, Low, Close, Volume and Trades and represents the following trading information for each time period:
- Open - the first traded price
- High - the highest traded price
- Low - the lowest traded price
- Close - the final traded price
- Volume - the total volume traded by all trades
- Trades - the number of individual trades
Don't hesitate to tell me if you need other period intervals 😉 ...
This dataset will be updated every quarter to add new, up-to-date market data. Let me know if you need more frequent updates.
Can you beat the market? Let's see what you can do with these data!
🟦 What this is This dataset provides full-depth liquidity grids for every Uniswap V3-style pool, covering the entire tick range (up to 1.5 million price points, from −887 272 to +887 272). Each snapshot reflects the theoretical price impact curve across all active ticks for a given token pair - a complete “liquidity surface” view of the pool state.
Key traits: • Schema-stable, versioned, and fully lineage-traceable • Tick-level resolution across the full Uniswap V3 range • Real-time (WSS) and historical coverage back to genesis
🌐 Chains / Coverage ETH, BSC, Base, Arbitrum, Unichain, Avalanche, Polygon, Celo, Linea, Optimism (others on request). Full history from chain genesis; reorg-aware real-time ingestion and updates. Covers all Uniswap V3-compatible pools; for Uniswap V2-style AMMs, only aggregate reserves are exposed (no grid) so clients can compute impact via the formula below.
🧮 Formula amount_in_max = (1 / fee) × ( √( (reserves_out × reserves_in) / target_price ) − reserves_in )
Conditions • The target price must be lower than the current price, i.e. reserves_out / reserves_in > target_price • Formula assumes a constant product AMM: reserves_in × reserves_out = k
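A minimal Python sketch of the formula above, treating fee as the retained fraction of the input (e.g. 0.997 for a 0.30% fee); that interpretation of fee is an assumption, so adjust it to your own convention:

import math

def amount_in_max(reserves_in: float, reserves_out: float,
                  target_price: float, fee: float = 0.997) -> float:
    """Max input to push a constant-product pool from its current price down to target_price."""
    current_price = reserves_out / reserves_in
    if target_price >= current_price:
        raise ValueError("target_price must be below the current price (reserves_out / reserves_in)")
    # amount_in_max = (1 / fee) * ( sqrt(reserves_out * reserves_in / target_price) - reserves_in )
    return (1.0 / fee) * (math.sqrt(reserves_out * reserves_in / target_price) - reserves_in)

# Example: 1,000 token_in and 2,000 token_out in reserve (current price = 2.0)
print(amount_in_max(1_000, 2_000, target_price=1.8))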
📑 Schema token_to_token_l3_snapshots • snapshot_id BIGINT - unique identifier for each grid snapshot. • pool_uid BIGINT - reference to the liquidity pool (liquidity_pools.uid). • tracing_id BYTEA - unique trace identifier for this row. • parent_tracing_ids BYTEA - lineage reference to parent records. • genesis_tracing_ids BYTEA - lineage reference to original on-chain sources. • genesis_block_number BIGINT - block number of the originating event. • genesis_tx_index INTEGER - transaction index within that block. • genesis_log_index INTEGER - log index within the transaction. • token_in BYTEA - 20-byte ERC-20 address of the input token. • token_out BYTEA - 20-byte ERC-20 address of the output token. • current_price NUMERIC(78,18) - mid-price (token_out per 1 token_in, decimals-adjusted). • grid_step_bps SMALLINT - spacing between grid points, in basis points. • grid_radius_bps INTEGER - total radius of the grid window, in basis points. • grid_points SMALLINT - number of grid points; must equal radius/step + 1. • reserves_in_adj NUMERIC(78,18) - adjusted reserve amount of the input token (decimals-normalized). • reserves_out_adj NUMERIC(78,18) - adjusted reserve amount of the output token (decimals-normalized).
token_to_token_l3_points • snapshot_id BIGINT - reference to the parent snapshot (token_to_token_l3_snapshots.snapshot_id). • point_index SMALLINT - sequential index (0 … grid_points − 1). • offset_bps_abs INTEGER - absolute offset from the mid-price, in basis points. • size_in NUMERIC(78,18) - executable input amount required to reach this offset. • size_out NUMERIC(78,18) - corresponding output amount at that offset. • price_at_point NUMERIC(78,18) - average price (out / in) including impact.
🔑 Keys & Joins • Primary keys: • token_to_token_l3_snapshots(snapshot_id) • token_to_token_l3_points(snapshot_id, point_index) • Lineage triple: (genesis_block_number, genesis_tx_index, genesis_log_index) • Foreign keys: • pool_uid → liquidity_pools(uid) • token_in, token_out → erc20_tokens(contract_address) • (genesis_block_number, genesis_tx_index, genesis_log_index) → logs(block_number, tx_index, log_index) • snapshot_id → token_to_token_l3_snapshots(snapshot_id)
🧬 Lineage & Reproducibility Every grid snapshot has a verifiable path back to the originating raw events via the lineage triple and tracing graph: • tracing_id - this row’s identity • parent_tracing_ids - immediate sources • genesis_tracing_ids - original on-chain sources This supports audits and exact reprocessing to source transactions/logs/function calls.
📈 Common uses • Accurate execution cost and slippage modeling at tick-level resolution • Market microstructure research and volatility surface mapping • Liquidity visualization tools for Uniswap V3 / V4 pools • Strategy backtesting using real tick granularity • Price impact forecasting and routing optimization
🚚 Delivery By default • WebSocket (WSS) reorg-aware live emissions when a new update is available; <140 ms median latency on ETH streams (7-day). • SFTP server for archives and daily End-of-Day (EOD) snapshots. • Model Context Protocol (MCP) for AI workflows (pull slices, schemas, lineage). Optional • Integrations to Amazon S3, Azure Blob Storage, Snowflake, and other enterprise platforms on request.
🗂️ Files (time-partitioned in UTC, compressed) • Parquet • CSV • XLS • JSON
💡 Quality and operations • Reorg-aware ingestion. • 99.95% uptime target SLA. • Backfills to chain genesis. • Versioned, schema-stable datasets; changes are additive and announced.
🔄 Change policy Schema is stable. Any breaking change ships as a new version (e.g., token_to_token_l3_v2) with migration notes. Content updates are additive (new rows/fields filled); types aren’t changed in place.
License: Attribution 4.0 (CC BY 4.0) (https://creativecommons.org/licenses/by/4.0/)
License information was derived automatically
Total-Stockholder-Equity Time Series for Bavarian Nordic. Bavarian Nordic A/S develops, manufactures, and supplies life-saving vaccines. The company offers non-replicating smallpox and monkeypox vaccines under the IMVAMUNE, IMVANEX, and JYNNEOS names; rabies vaccine for human use under the Rabipur/RabAvert name; tick-borne encephalitis vaccine under the Encepur name; Vaxchora, an oral vaccine for immunization against cholera; and Vivotif/Typhoral, an oral vaccine for immunization against typhoid fever. It is also developing MVA-BN WEV for the treatment of encephalitis viruses. It operates in the United States, Denmark, Canada, France, Germany, Singapore, Finland, Netherlands, Switzerland, Sweden, Taiwan, Saudi Arabia, Belgium, and internationally. Bavarian Nordic A/S was incorporated in 1992 and is headquartered in Hellerup, Denmark.
License: Attribution-NonCommercial-ShareAlike 4.0 (CC BY-NC-SA 4.0) (https://creativecommons.org/licenses/by-nc-sa/4.0/)
License information was derived automatically
The data provided here as part of the DEBS 2022 Grand Challenge is based on real tick data captured by Infront Financial Technology GmbH for the complete week of November 8th to 14th, 2021 (i.e., five trading days Monday to Friday + Saturday and Sunday). The data set contains 289 million tick data events covering 5504 equities and indices that are traded on three European exchanges: Paris (FR), Amsterdam (NL), and Frankfurt (ETR).
Some event notifications appear to come with no payload. This is because the 2022 GC requires only a small subset of attributes to be evaluated; other attributes have been removed from the data set to minimize its overall size while keeping the number of events to process unchanged.
Further details on the data set, its syntax and its semantics can be found in the official DEBS 2022 Grand Challenge paper as part of the DEBS 2022 conference proceedings (please use this for citation):
Sebastian Frischbier, Jawad Tahir, Christoph Doblander, Arne Hormann, Ruben Mayer, and Hans-Arno Jacobsen. 2022. The DEBS 2022 Grand Challenge: Detecting Trading Trends in Financial Tick Data. In The 16th ACM International Conference on Distributed and Event-based Systems (DEBS ’22), June 27-June 30, 2022, Copenhagen. ACM, New York, NY, USA.
All files of the DEBS 2022 Grand Challenge Data Set “Trading Data” are provided as-is. By downloading and using this data you agree to the terms and conditions of the licensing agreement (CC BY-NC-SA).
License: Attribution 4.0 (CC BY 4.0) (https://creativecommons.org/licenses/by/4.0/)
License information was derived automatically
An example of TRTH intraday top-of-book transaction data for a single Johannesburg Stock Exchange (JSE) listed equity. The data is intended for teaching, learning, and research projects and was sourced from the legacy Tick History v1 SOAP API interface at https://tickhistory.thomsonreuters.com/TickHistory in May 2016. Related raw data and similar data structures can now be accessed via Tick History v2 and the REST API at https://hosted.datascopeapi.reuters.com/RestApi/v1.
Configuration control: the test dataset contains 16 CSV files with names: "
Attributes: The data set covers the ticker AGLJ.J from May 2010 until May 2016. The files include the following attributes (field names in parentheses): RIC (RIC), Local Date-Time (DateTimeL), Event Type (Type), Price at the Event (Price), Volume at the Event (Volume), Best Bid Changes (L1 Bid), Best Ask Changes (L1 Ask), and Trade Event Sign (Trade Sign). The Local Date-Time (DateTimeL) is a serial date number where 1 corresponds to 1-Jan-0000; for example, 736333.382013 corresponds to 4-Jan-2016 09:10:05 (20160104T091005 in ISO 8601 format). The trade event sign (Trade Sign) indicates whether the transaction was buyer (+1) or seller (-1) initiated and was prepared using the method of Lee and Ready (1991).
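Two helpers that may be convenient when working with these fields: a DateTimeL conversion (assuming the serial-date convention described above, where day 1 is 1-Jan-0000) and a simplified Lee and Ready classification (quote rule with a tick-test fallback; the original method also lags the quotes, which is omitted here):

from datetime import datetime, timedelta

def datenum_to_datetime(datenum: float) -> datetime:
    """Convert a serial date number (day 1 = 1-Jan-0000) to a Python datetime."""
    days = int(datenum)
    frac = datenum - days
    # Python ordinals start at 1-Jan-0001, hence the 366-day shift.
    return datetime.fromordinal(days) + timedelta(days=frac) - timedelta(days=366)

def lee_ready_sign(price: float, l1_bid: float, l1_ask: float, prev_price: float) -> int:
    """+1 for buyer-initiated, -1 for seller-initiated (simplified: no quote lag)."""
    mid = 0.5 * (l1_bid + l1_ask)
    if price > mid:
        return 1
    if price < mid:
        return -1
    return 1 if price >= prev_price else -1  # tick test when the trade prints at the midpoint

print(datenum_to_datetime(736333.382013))  # ~4-Jan-2016 09:10:05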
Disclaimer: The data is not up-to-date, is incomplete, and has been pre-processed; as such, it is not fit for any purpose other than teaching, learning, and algorithm testing. For complete, up-to-date, and error-free data, please use the Tick History v2 interface directly.
Research Objectives: The data has been used to build empirical evidence in support of hierarchical causality and universality in financial markets by considering price impact on different time and averaging scales, feature selection on different scales as inputs into scale dependent machine learning applications, and for various aspects of agent-based model calibration and market ecology studies on different time and averaging scales.
Acknowledgements to: Diane Wilcox, Dieter Hendricks, Michael Harvey, Fayyaaz Loonat, Michael Gant, Nicholas Murphy and Donovan Platt.
🟦 What this is This dataset represents Level-2 (L2) liquidity impact grids - quantized snapshots of executable depth and pricing for each token pair within a liquidity pool. Each row captures the impact curve around mid-price (e.g., ±10%) in equal steps, showing how much can be executed before the price shifts.
🌐 Chains / Coverage ETH, BSC, Base, Arbitrum, Unichain, Avalanche, Polygon, Celo, Linea, Optimism (others on request). Full history from chain genesis; reorg-aware real-time ingestion and updates.
📑 Schema token_to_token_l2_snapshots • snapshot_id BIGINT - unique identifier for each grid snapshot. • pool_uid BIGINT - reference to the liquidity pool (liquidity_pools.uid). • tracing_id BYTEA - unique trace identifier for this row. • parent_tracing_ids BYTEA - lineage reference to parent records. • genesis_tracing_ids BYTEA - lineage reference to original on-chain sources. • genesis_block_number BIGINT - block number of the originating event. • genesis_tx_index INTEGER - transaction index within that block. • genesis_log_index INTEGER - log index within the transaction. • token_in BYTEA - 20-byte ERC-20 address of the input token. • token_out BYTEA - 20-byte ERC-20 address of the output token. • current_price NUMERIC(78,18) - mid-price (token_out per 1 token_in, decimals-adjusted). • grid_step_bps SMALLINT - spacing between grid points, in basis points. • grid_radius_bps INTEGER - total radius of the grid window, in basis points. • grid_points SMALLINT - number of grid points; must equal radius/step + 1.
token_to_token_l2_points • snapshot_id BIGINT - reference to the parent snapshot (token_to_token_l2_snapshots.snapshot_id). • point_index SMALLINT - sequential index (0 … grid_points − 1). • offset_bps_abs INTEGER - absolute offset from the mid-price, in basis points. • size_in NUMERIC(78,18) - executable input amount required to reach this offset. • size_out NUMERIC(78,18) - corresponding output amount at that offset. • price_at_point NUMERIC(78,18) - average price (out / in) including impact.
🔑 Keys & Joins Primary keys: • token_to_token_l2_snapshots(snapshot_id) • token_to_token_l2_points(snapshot_id, point_index) Lineage triple: (genesis_block_number, genesis_tx_index, genesis_log_index) Foreign keys: • pool_uid → liquidity_pools(uid) • token_in, token_out → erc20_tokens(contract_address) • (genesis_block_number, genesis_tx_index, genesis_log_index) → logs(block_number, tx_index, log_index) • snapshot_id → token_to_token_l2_snapshots(snapshot_id)
🧬 Lineage & Reproducibility Every grid snapshot has a verifiable path back to the originating raw events via the lineage triple and tracing graph: • tracing_id - this row’s identity • parent_tracing_ids - immediate sources • genesis_tracing_ids - original on-chain sources This supports audits and exact reprocessing to source transactions/logs/function calls.
📈 Common uses • Depth & impact modeling for execution algorithms • Market-making simulation and slippage profiling • MEV and arbitrage detection at grid resolution • Feature engineering for price impact models • Real-time liquidity monitoring and alerting
🚚 Delivery By default • WebSocket (WSS) reorg-aware live emissions when a new update is available; <140 ms median latency on ETH streams (7-day). • SFTP server for archives and daily End-of-Day (EOD) snapshots. • Model Context Protocol (MCP) for AI workflows (pull slices, schemas, lineage). Optional • Integrations to Amazon S3, Azure Blob Storage, Snowflake, and other enterprise platforms on request.
🗂️ Files (time-partitioned in UTC, compressed) • Parquet • CSV • XLS • JSON
💡 Quality and operations • Reorg-aware ingestion. • 99.95% uptime target SLA. • Backfills to chain genesis. • Versioned, schema-stable datasets; changes are additive and announced.
🔄 Change policy Schema is stable. Any breaking change ships as a new version (e.g., token_to_token_l2_v2) with migration notes. Content updates are additive (new rows/fields filled); types aren’t changed in place.
License: Attribution 4.0 (CC BY 4.0) (https://creativecommons.org/licenses/by/4.0/)
License information was derived automatically
Free-Cash-Flow-To-Equity Time Series for Bavarian Nordic. Bavarian Nordic A/S develops, manufactures, and supplies life-saving vaccines. The company offers non-replicating smallpox and monkeypox vaccines under the IMVAMUNE, IMVANEX, and JYNNEOS names; rabies vaccine for human use under the Rabipur/RabAvert name; tick-borne encephalitis vaccine under the Encepur name; Vaxchora, an oral vaccine for immunization against cholera; and Vivotif/Typhoral, an oral vaccine for immunization against typhoid fever. It is also developing MVA-BN WEV for the treatment of encephalitis viruses. It operates in the United States, Denmark, Canada, France, Germany, Singapore, Finland, Netherlands, Switzerland, Sweden, Taiwan, Saudi Arabia, Belgium, and internationally. Bavarian Nordic A/S was incorporated in 1992 and is headquartered in Hellerup, Denmark.
License: CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/)
Real-time, up-to-date cryptocurrency market data can be quite expensive and hard to get. However, historical financial data are the starting point for developing algorithms to analyze market trends - and why not try to beat the market by predicting its movements?
Data provided in this dataset are historical data from the beginning of the XBT-CHF pair market on the Kraken exchange up to the present (December 2021). The data come from real trades on one of the most popular cryptocurrency exchanges.
Historical market data, also known as trading history, time and sales, or tick data, provides a detailed record of every trade that happens on the Kraken exchange and includes the following information:
- Timestamp - the exact date and time of each trade
- Price - the price at which each trade occurred
- Volume - the amount of volume that was traded
In addition, OHLCVT data are provided for the most common period intervals: 1 min, 5 min, 15 min, 1 hour, 12 hours and 1 day. OHLCVT stands for Open, High, Low, Close, Volume and Trades and represents the following trading information for each time period:
- Open - the first traded price
- High - the highest traded price
- Low - the lowest traded price
- Close - the final traded price
- Volume - the total volume traded by all trades
- Trades - the number of individual trades
Don't hesitate to tell me if you need other period intervals 😉 ...
This dataset will be updated every quarter to add new, up-to-date market data. Let me know if you need more frequent updates.
Can you beat the market? Let's see what you can do with these data!
License: Attribution 4.0 (CC BY 4.0) (https://creativecommons.org/licenses/by/4.0/)
License information was derived automatically
Average quoted, effective, and realized bid-ask spreads by location of trade price relative to the quote.
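For reference, these measures usually follow the standard definitions sketched below; the exact conventions used in this table (e.g. the horizon for the realized spread, absolute vs. relative units) are not documented here, so treat this as an assumption rather than the dataset's own methodology.

def quoted_spread(bid: float, ask: float) -> float:
    return ask - bid                              # absolute quoted spread

def effective_spread(price: float, bid: float, ask: float, sign: int) -> float:
    mid = 0.5 * (bid + ask)
    return 2.0 * sign * (price - mid)             # sign: +1 buyer-, -1 seller-initiated

def realized_spread(price: float, sign: int, future_mid: float) -> float:
    return 2.0 * sign * (price - future_mid)      # midpoint measured some minutes after the trade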
License: CC0 1.0 Universal Public Domain Dedication (https://creativecommons.org/publicdomain/zero/1.0/)
Real-time, up-to-date cryptocurrency market data can be quite expensive and hard to get. However, historical financial data are the starting point for developing algorithms to analyze market trends - and why not try to beat the market by predicting its movements?
Data provided in this dataset are historical data from the beginning of the LINK-USD pair market on the Kraken exchange up to the present (December 2021). The data come from real trades on one of the most popular cryptocurrency exchanges.
Historical market data, also known as trading history, time and sales, or tick data, provides a detailed record of every trade that happens on the Kraken exchange and includes the following information:
- Timestamp - the exact date and time of each trade
- Price - the price at which each trade occurred
- Volume - the amount of volume that was traded
In addition, OHLCVT data are provided for the most common period intervals: 1 min, 5 min, 15 min, 1 hour, 12 hours and 1 day. OHLCVT stands for Open, High, Low, Close, Volume and Trades and represents the following trading information for each time period:
- Open - the first traded price
- High - the highest traded price
- Low - the lowest traded price
- Close - the final traded price
- Volume - the total volume traded by all trades
- Trades - the number of individual trades
Don't hesitate to tell me if you need other period intervals 😉 ...
This dataset will be updated every quarter to add new, up-to-date market data. Let me know if you need more frequent updates.
Can you beat the market? Let's see what you can do with these data!