100+ datasets found
  1. Event logs for process mining

    • kaggle.com
    zip
    Updated Apr 11, 2023
    Cite
    Alberto (2023). Event logs for process mining [Dataset]. https://www.kaggle.com/datasets/carlosalvite/car-insurance-claims-event-log-for-process-mining
    Explore at:
    Available download formats: zip (4892593 bytes)
    Dataset updated
    Apr 11, 2023
    Authors
    Alberto
    License

    http://opendatacommons.org/licenses/dbcl/1.0/

    Description

    This event log has been artificially generated and curated to provide a comprehensive view of car insurance claims, allowing users to discover and identify bottlenecks, automation opportunities, conformance issues, reworks, and potential fraudulent cases using any process mining software.

    Looking for more event logs, use cases and training material?

    👉 Join our Skool community

    You'll find a hands-on training course and realistic set of 9+ event logs.

    Standard Process flow: “First Notification of Loss (FNOL)” -> “Assign Claim” -> “Claim Decision” -> “Set Reserve” -> “Payment Sent” -> “Close Claim”

    Attributes: case ID, activity name, timestamp, claimant name, agent name, adjuster name, claim amount, claimant age, type of policy, car make, car model, car year, date and time of the accident, type of accident, user type

    Total number of claims: 30,000

    Dates: Claims belong to years 2020, 2021, and 2022.

    Disclaimer: Personal names are fake.
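With case ID, activity, and timestamp in place, the log's trace variants can be derived with a few lines of stdlib Python. A minimal sketch; the rows below are invented, and the real CSV's headers and timestamp format may differ:

```python
from collections import defaultdict
from datetime import datetime

# Invented example rows (case ID, activity, timestamp); not the actual file.
rows = [
    ("C1", "First Notification of Loss (FNOL)", "2020-01-02 09:00"),
    ("C1", "Assign Claim", "2020-01-02 10:30"),
    ("C1", "Claim Decision", "2020-01-03 11:00"),
    ("C2", "First Notification of Loss (FNOL)", "2020-01-05 08:00"),
    ("C2", "Assign Claim", "2020-01-05 09:15"),
]

# Group events by case, order by timestamp, and derive each case's variant.
cases = defaultdict(list)
for case_id, activity, ts in rows:
    cases[case_id].append((datetime.strptime(ts, "%Y-%m-%d %H:%M"), activity))

variants = defaultdict(int)
for events in cases.values():
    trace = tuple(act for _, act in sorted(events))
    variants[trace] += 1

for trace, count in variants.items():
    print(" -> ".join(trace), f"(x{count})")
```

Counting cases per variant like this is the usual first step before handing the log to a discovery algorithm.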

  2. Event Log Sampling Datasets

    • figshare.com
    zip
    Updated Jul 22, 2022
    Cite
    CONG LIU (2022). Event Log Sampling Datasets [Dataset]. http://doi.org/10.6084/m9.figshare.20354505.v1
    Explore at:
    Available download formats: zip
    Dataset updated
    Jul 22, 2022
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    CONG LIU
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This dataset includes 9 event logs that can be used to experiment with log-completeness-oriented event log sampling methods.

    · exercise.xes: A simulation log generated from a paper-review process model; each trace describes the paper-reviewing process in detail.

    · training_log_1/3/8.xes: Three human-trained simulation logs from the 2016 Process Discovery Competition (PDC 2016). Each event carries two values: the name of the process-model activity it references and the identifier of the case to which it belongs.

    · Production.xes: Process data from production processes; each trace includes fields for case, activity, resource, timestamp, and more.

    · BPIC_2012_A/O/W.xes: Three logs derived from the personal loan and overdraft application process of a financial institution in the Netherlands. Each trace describes one customer's application.

    · CrossHospital.xes: Treatment-process data of emergency patients in a hospital; each trace represents the treatment of one emergency patient.
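XES is an XML-based standard, so these logs can be inspected even without a process mining toolkit. A minimal sketch using the stdlib XML parser; the fragment below is invented and omits the XES namespace declaration that real files such as exercise.xes carry:

```python
import xml.etree.ElementTree as ET

# Invented, namespace-free XES fragment; real logs declare
# xmlns="http://www.xes-standard.org/" on the <log> element.
XES = """
<log>
  <trace>
    <string key="concept:name" value="case_1"/>
    <event><string key="concept:name" value="Invite Reviewers"/></event>
    <event><string key="concept:name" value="Collect Reviews"/></event>
    <event><string key="concept:name" value="Decide"/></event>
  </trace>
</log>
"""

root = ET.fromstring(XES)
traces = []
for trace in root.findall("trace"):
    # An event's activity name is stored as a child <string> with
    # key="concept:name" per the XES standard attribute naming.
    activities = [
        s.get("value")
        for ev in trace.findall("event")
        for s in ev.findall("string")
        if s.get("key") == "concept:name"
    ]
    traces.append(activities)

print(traces)
```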

  3. incident-event-log-dataset

    • kaggle.com
    zip
    Updated Oct 10, 2024
    Cite
    Mithil Kotawadekar (2024). incident-event-log-dataset [Dataset]. https://www.kaggle.com/datasets/mithilkotawadekar/incident-event-log-dataset
    Explore at:
    Available download formats: zip (2584951 bytes)
    Dataset updated
    Oct 10, 2024
    Authors
    Mithil Kotawadekar
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    I utilized a publicly accessible dataset (creators: Claudio Amaral, Marcelo Fantinato and Sarajane Peres), with slight modifications, for my academic work. This is an event log of an incident management process extracted from data gathered from the audit system of an instance of the ServiceNow™ platform used by an IT company. The event log is enriched with data loaded from a relational database underlying a corresponding process-aware information system. Information was anonymized for privacy.

  4. Process Mining Event Log - Incident Management

    • kaggle.com
    zip
    Updated Apr 20, 2025
    Cite
    Alberto P (2025). Process Mining Event Log - Incident Management [Dataset]. https://www.kaggle.com/datasets/albertopmd/process-mining-event-log-incident-management
    Explore at:
    Available download formats: zip (2301112 bytes)
    Dataset updated
    Apr 20, 2025
    Authors
    Alberto P
    License

    MIT License: https://opensource.org/licenses/MIT
    License information was derived automatically

    Description

    This realistic incident management event log simulates a common IT service process and includes key inefficiencies found in real-world operations. You'll uncover SLA violations, multiple reassignments, bottlenecks, and conformance issues—making it an ideal dataset for hands-on process mining, root cause analysis, and performance optimization exercises.

    Looking for more event logs and use cases?

    This hands-on process mining training focuses on real projects, includes 9 realistic event logs, a practical analysis framework, lifetime access, community support, and a 14-day money-back guarantee. No fluff, no long theory. Focused on real-world results.

    As Kaggle user, you can get a $100 discount using code XM7SB8Y: 👉 https://processminingacademy.com/course-sales-kg-8068

    Standard Process Flow: Ticket Created -> Ticket Assigned to Level 1 Support -> WIP - Level 1 Support -> Level 1 Escalates to Level 2 Support -> WIP - Level 2 Support -> Ticket Solved by Level 2 Support -> Customer Feedback Received -> Ticket Closed

    Total Number of Incident Tickets: 31,000+

    Process Variants: 13

    Number of Events: 242,000+

    Year: 2023

    File Format: CSV

    File Size: 65MB
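The SLA violations this log is designed to surface can be found by comparing each ticket's end-to-end duration against a threshold. A minimal sketch with invented events and an assumed 24-hour SLA; the real CSV's column names and SLA policy may differ:

```python
from datetime import datetime

# Invented ticket events (ticket ID, activity, timestamp); not the actual file.
events = [
    ("T1", "Ticket Created", "2023-03-01 08:00"),
    ("T1", "Ticket Closed", "2023-03-01 20:00"),
    ("T2", "Ticket Created", "2023-03-02 09:00"),
    ("T2", "Ticket Closed", "2023-03-04 09:00"),
]

SLA_HOURS = 24  # assumed SLA threshold, not taken from the dataset

# Track each ticket's earliest and latest event to get its cycle time.
first, last = {}, {}
for tid, act, ts in events:
    t = datetime.strptime(ts, "%Y-%m-%d %H:%M")
    first[tid] = min(first.get(tid, t), t)
    last[tid] = max(last.get(tid, t), t)

violations = [tid for tid in first
              if (last[tid] - first[tid]).total_seconds() / 3600 > SLA_HOURS]
print(violations)  # tickets whose end-to-end duration exceeded the SLA
```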

  5. Event Logs CSV

    • figshare.com
    rar
    Updated Dec 9, 2019
    Cite
    Dina Bayomie (2019). Event Logs CSV [Dataset]. http://doi.org/10.6084/m9.figshare.11342063.v1
    Explore at:
    Available download formats: rar
    Dataset updated
    Dec 9, 2019
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    Dina Bayomie
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    The event logs in CSV format. The dataset contains both correlated and uncorrelated logs.

  6. Windows Event Log

    • kaggle.com
    zip
    Updated Nov 9, 2023
    Cite
    boomerang476 (2023). Windows Event Log [Dataset]. https://www.kaggle.com/datasets/boomerang476/windows-event-log
    Explore at:
    Available download formats: zip (1667784378 bytes)
    Dataset updated
    Nov 9, 2023
    Authors
    boomerang476
    Description

    This dataset was created by boomerang476.

  7. Data from: MIMICEL: MIMIC-IV Event Log for Emergency Department

    • physionet.org
    Updated Jun 16, 2023
    Cite
    Jia Wei; Zhipeng He; Chun Ouyang; Catarina Moreira (2023). MIMICEL: MIMIC-IV Event Log for Emergency Department [Dataset]. http://doi.org/10.13026/c9yj-1t90
    Explore at:
    Dataset updated
    Jun 16, 2023
    Authors
    Jia Wei; Zhipeng He; Chun Ouyang; Catarina Moreira
    License

    https://github.com/MIT-LCP/license-and-dua/tree/master/drafts

    Description

    In this work, we extract an event log from the MIMIC-IV-ED dataset by adopting a well-established event log generation methodology, and we name this event log MIMICEL. The data tables in the MIMIC-IV-ED dataset relate to each other through the existing relational database schema, and each table records the individual activities of patients along their journey in the emergency department (ED). While the data tables capture snapshots of a patient's journey in the ED, the extracted event log MIMICEL aims to capture the end-to-end process of that journey. This enables analysis of existing patient flows, thereby helping improve the efficiency of the ED process.

  8. Unlabelled Event Log Datasets and Experimental Results

    • figshare.com
    zip
    Updated Jun 24, 2022
    Cite
    CONG LIU (2022). Unlabelled Event Log Datasets and Experimental Results [Dataset]. http://doi.org/10.6084/m9.figshare.20138267.v1
    Explore at:
    Available download formats: zip
    Dataset updated
    Jun 24, 2022
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    CONG LIU
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Unlabelled Event Log Datasets

  9. Incident_event_log_dataset

    • kaggle.com
    zip
    Updated Mar 24, 2022
    Cite
    winmedals (2022). Incident_event_log_dataset [Dataset]. https://www.kaggle.com/datasets/winmedals/incident-event-log-dataset
    Explore at:
    Available download formats: zip (2571433 bytes)
    Dataset updated
    Mar 24, 2022
    Authors
    winmedals
    Description

    Source: https://archive.ics.uci.edu/ml/datasets/Incident+management+process+enriched+event+log

    Reposted as a Kaggle dataset for convenience and fast usage.

  10. Dataset of mHealth event logs

    • figshare.com
    pdf
    Updated May 1, 2022
    Cite
    Raoul Nuijten; Pieter Van Gorp (2022). Dataset of mHealth event logs [Dataset]. http://doi.org/10.6084/m9.figshare.19688730.v2
    Explore at:
    Available download formats: pdf
    Dataset updated
    May 1, 2022
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    Raoul Nuijten; Pieter Van Gorp
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    How does Facebook always seem to know what the next funny video should be to keep your attention on the platform? Facebook has not asked you whether you like videos of cats doing something funny: it simply learns from your behavior on the platform (e.g., how long you have engaged with similar videos, which posts you have previously liked or commented on). As a result, Facebook is able to sustain its users' attention for a long time. Typical mHealth apps, by contrast, suffer from rapidly collapsing user engagement levels. To sustain engagement, mHealth apps nowadays employ all sorts of intervention strategies. It would of course be powerful to know, as Facebook does, which strategy should be presented to which individual to sustain their engagement. A first step toward that goal could be to cluster similar users (and then derive intervention strategies from there). This dataset was collected through a single mHealth app over 8 different mHealth campaigns (i.e., scientific studies). Using this dataset, one could derive clusters from app user event data. One approach differentiates between two phases: a process mining phase and a clustering phase. In the process mining phase, one derives from the dataset the processes (i.e., sequences of app actions) that users undertake. In the clustering phase, one clusters users that perform similar sequences of app actions.

    List of files

    0-list-of-variables.pdf: an overview of the variables in the dataset.
    1-description-of-endpoints.pdf: a description of the unique endpoints that appear in the dataset.
    2-requests.csv: the actual app user event data.
    2-requests-by-session.csv: the same event data with a session variable, to differentiate between user requests made in the same session.
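The two-phase idea above (mine each user's action sequence, then cluster similar users) can be sketched in a few lines. The endpoint names below are invented, and the greedy Jaccard grouping is only a stand-in for a proper clustering algorithm:

```python
# Invented per-user action sequences; real endpoints are documented in
# 1-description-of-endpoints.pdf.
user_sequences = {
    "u1": ["open_app", "view_challenge", "log_steps"],
    "u2": ["open_app", "view_challenge", "log_steps", "share_result"],
    "u3": ["open_app", "read_faq"],
}

def jaccard(a, b):
    """Similarity of two activity sets: |intersection| / |union|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

# Greedy grouping: join a user to the first cluster whose representative
# (its first member) is similar enough, else start a new cluster.
clusters = []
for user, seq in user_sequences.items():
    for cluster in clusters:
        rep = user_sequences[cluster[0]]
        if jaccard(seq, rep) >= 0.5:
            cluster.append(user)
            break
    else:
        clusters.append([user])

print(clusters)
```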

  11. CeLOE event log sample

    • dataverse.telkomuniversity.ac.id
    tsv
    Updated Apr 20, 2022
    Cite
    Telkom University Dataverse (2022). CeLOE event log sample [Dataset]. http://doi.org/10.34820/FK2/9FT77M
    Explore at:
    Available download formats: tsv (10066 bytes), tsv (19847 bytes)
    Dataset updated
    Apr 20, 2022
    Dataset provided by
    Telkom University Dataverse
    License

    CC0 1.0 Universal Public Domain Dedication: https://creativecommons.org/publicdomain/zero/1.0/
    License information was derived automatically

    Description

    This study analyses an event log, automatically generated by the CeLOE LMS, that records student and lecturer activities in learning. The event log is mined to obtain a process model representing learning behaviours of the lecturers and students during the learning process. The case study in this research is learning in the study program 365 during the first semester of 2020/2021.

  12. Event Log Datasets

    • figshare.com
    csv
    Updated Jul 15, 2025
    Cite
    Qi Mo (2025). Event Log Datasets [Dataset]. http://doi.org/10.6084/m9.figshare.29568722.v1
    Explore at:
    Available download formats: csv
    Dataset updated
    Jul 15, 2025
    Dataset provided by
    Figshare (http://figshare.com/)
    Authors
    Qi Mo
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    Twelve public event log datasets for Collaborative Business Processes (CBPs) are presented here. Log-01~Log-04 are collected from CBPs for the treatment of diseases such as gastric ulcer and diabetes in hospitals; Log-05~Log-07 from CBPs for designing and manufacturing products such as automobiles; Log-08~Log-09 from CBPs for financial services such as bank loans; and Log-10~Log-12 from CBPs in e-commerce, such as return processing in online stores.

  13. Site A1 - Event Log / Derived Data

    • osti.gov
    Updated Jul 2, 2025
    Cite
    Pacific Northwest National Laboratory (2025). Site A1 - Event Log / Derived Data [Dataset]. http://doi.org/10.21947/2568471
    Explore at:
    Dataset updated
    Jul 2, 2025
    Dataset provided by
    United States Department of Energy (http://energy.gov/)
    Office of Energy Efficiency and Renewable Energy (http://energy.gov/eere)
    Pacific Northwest National Laboratory
    Description

    This dataset contains the event log table with 10-minute wind statistics from the scanning lidar at AWAKEN's site A1. This is a good dataset to start from for people unfamiliar with the AWAKEN project.

  14. Object-Centric Event Log (OCEL) of the Enron Email Dataset

    • zenodo.org
    bin
    Updated May 26, 2025
    Cite
    Alessandro Berti (2025). Object-Centric Event Log (OCEL) of the Enron Email Dataset [Dataset]. http://doi.org/10.5281/zenodo.15516869
    Explore at:
    Available download formats: bin
    Dataset updated
    May 26, 2025
    Dataset provided by
    Zenodo (http://zenodo.org/)
    Authors
    Alessandro Berti
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description


    This dataset provides an object-centric event log (OCEL) representation of the publicly available Enron email corpus. The OCEL format allows for a richer analysis of interconnected processes and objects, making it particularly suitable for advanced process mining techniques, communication pattern analysis, and social network exploration.

    The event logs were generated from a pre-processed CSV version of the Enron emails using a custom Python script leveraging the PM4Py library. The script parses individual emails to extract key information, including:

    • Timestamps: Derived from the 'Date' field of emails, parsed into timezone-aware datetime objects.
    • Activities: Inferred from email subject prefixes (e.g., "Re:" becomes "Response", "Fw:" becomes "Forwarding", "Invitation:" becomes "Invitation"). Emails without recognized prefixes are assigned a "Default" activity.
    • Objects: Two primary object types are identified:
      • EMAILADDRESS: Extracted from 'From', 'To', and 'Cc' fields.
      • MESSAGEID: Extracted from 'Message-ID', 'In-Reply-To', and 'References' fields, prefixed with "MID_" in the OCEL to ensure unique object identifiers across types.
    • Attributes: Event attributes include the original cleaned subject and content of the email.
    • Relationships: Events (emails) are linked to EMAILADDRESS objects with qualifiers 'FROM', 'TO', or 'CC'. Events are linked to MESSAGEID objects with qualifiers 'MESSAGEID' (for the email's own ID), 'INREPLYTO', or 'REFERENCES' to trace conversational threads.
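The subject-prefix rule for inferring activities can be sketched as a small classifier. The mapping mirrors the examples given above ("Re:" becomes "Response", "Fw:" becomes "Forwarding", "Invitation:" becomes "Invitation"); the "Fwd:" variant is an assumption, not something the description confirms:

```python
# Prefix-to-activity rules, checked in order; "Fwd:" is an assumed variant.
PREFIX_ACTIVITIES = [
    ("re:", "Response"),
    ("fw:", "Forwarding"),
    ("fwd:", "Forwarding"),
    ("invitation:", "Invitation"),
]

def infer_activity(subject: str) -> str:
    """Map an email subject to an activity; unrecognized prefixes get 'Default'."""
    s = subject.strip().lower()
    for prefix, activity in PREFIX_ACTIVITIES:
        if s.startswith(prefix):
            return activity
    return "Default"

print(infer_activity("Re: Q3 numbers"))    # Response
print(infer_activity("Meeting tomorrow"))  # Default
```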

    To accommodate various analytical needs and computational resources, the dataset is provided in three distinct checkpoints:

    1. Top 10,000 Emails: An OCEL generated from the first 10,000 emails processed.
    2. Top 100,000 Emails: An OCEL generated from the first 100,000 emails processed.
    3. All Emails: An OCEL generated from all emails processed by the script from the input emails.csv file.

    Each checkpoint is available in the .jsonocel format (OCEL 2.0 standard), ready for use with PM4Py and other OCEL-compatible process mining tools. This dataset can be valuable for researchers and practitioners seeking to apply object-centric process discovery, conformance checking, and enhancement techniques to a large, real-world communication log.

    Keywords: Object-Centric Event Log, OCEL, Process Mining, Enron Dataset, Email Analysis, Communication Networks, Social Network Analysis, PM4Py

  15. Public benchmark dataset for Conformance Checking in Process Mining

    • figshare.unimelb.edu.au
    • melbourne.figshare.com
    xml
    Updated Jan 30, 2022
    Cite
    Daniel Reissner (2022). Public benchmark dataset for Conformance Checking in Process Mining [Dataset]. http://doi.org/10.26188/5cd91d0d3adaa
    Explore at:
    Available download formats: xml
    Dataset updated
    Jan 30, 2022
    Dataset provided by
    The University of Melbourne
    Authors
    Daniel Reissner
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This dataset contains a variety of publicly available real-life event logs. We derived two types of Petri nets for each event log with two state-of-the-art process miners: Inductive Miner (IM) and Split Miner (SM). Each event log-Petri net pair is intended for evaluating the scalability of existing conformance checking techniques. We used this dataset to evaluate the scalability of the S-Component approach for measuring fitness. The dataset contains tables of descriptive statistics of both process models and event logs. In addition, it includes time-performance results, measured in milliseconds, for several approaches in both multi-threaded and single-threaded executions. Last, it contains a cost comparison of different approaches and reports on the degree of over-approximation of the S-Component approach. The compared conformance checking techniques are described here: https://arxiv.org/abs/1910.09767.

    Update: The dataset has been extended with the event logs of BPIC18 and BPIC19. BPIC19 is actually a collection of four different processes and was thus split into four event logs. For each of the additional five event logs, again, two process models have been mined with Inductive and Split Miner. We used the extended dataset to test the scalability of our tandem-repeats approach for measuring fitness. The dataset now contains updated tables of log and model statistics as well as tables of the conducted experiments measuring execution time and raw fitness cost of various fitness approaches. The compared conformance checking techniques are described here: https://arxiv.org/abs/2004.01781.

    Update: The dataset has also been used to measure the scalability of a new generalization measure based on concurrent and repetitive patterns. A concurrency oracle is used in tandem with partial orders to identify concurrent patterns in the log, which are tested against parallel blocks in the process model. Tandem repeats are used with various trace reductions and extensions to define repetitive patterns in the log, which are tested against loops in the process model. Each pattern is assigned a partial fulfillment. The generalization is then the average of pattern fulfillments weighted by the trace counts for which the patterns have been observed. The dataset now includes the time results and a breakdown of generalization values.
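As a toy illustration of what a fitness measure replays (not the S-Component, alignment, or tandem-repeats techniques this dataset actually benchmarks), one can score each trace by the fraction of its directly-follows moves that a model allows. The model below is invented:

```python
# Invented directly-follows relation of a hypothetical process model:
# activity -> set of activities allowed to follow it.
model = {
    "a": {"b"},
    "b": {"c", "d"},
    "c": {"d"},
    "d": set(),
}

def trace_fitness(trace):
    """Fraction of a trace's directly-follows moves permitted by the model."""
    moves = list(zip(trace, trace[1:]))
    if not moves:
        return 1.0  # an empty or single-event trace has nothing to violate
    ok = sum(1 for x, y in moves if y in model.get(x, set()))
    return ok / len(moves)

log = [["a", "b", "c", "d"], ["a", "d", "b"]]
print([trace_fitness(t) for t in log])
```

Real conformance checkers compute optimal alignments between trace and model rather than this local check, which is exactly why their scalability needs benchmarking.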

  16. Data from: Synthetic Event Logs for Concept Drift Detection

    • ieee-dataport.org
    Updated May 18, 2022
    Cite
    Victor Gallego-Fontenla (2022). Synthetic Event Logs for Concept Drift Detection [Dataset]. https://ieee-dataport.org/open-access/synthetic-event-logs-concept-drift-detection
    Explore at:
    Dataset updated
    May 18, 2022
    Authors
    Victor Gallego-Fontenla
    Description

    Real-life business processes change over time.

  17. WTMP Event Log

    • ieee-dataport.org
    • data.mendeley.com
    Updated Jun 9, 2025
    Cite
    Huiling Li (2025). WTMP Event Log [Dataset]. https://ieee-dataport.org/documents/wtmp-event-log
    Explore at:
    Dataset updated
    Jun 9, 2025
    Authors
    Huiling Li
    Description

    China.

  18. JUnit 4.12 Software Event Log

    • figshare.com
    txt
    Updated Jun 4, 2023
    Cite
    Maikel Leemans (2023). JUnit 4.12 Software Event Log [Dataset]. http://doi.org/10.4121/uuid:cfed8007-91c8-4b12-98d8-f233e5cd25bb
    Explore at:
    Available download formats: txt
    Dataset updated
    Jun 4, 2023
    Dataset provided by
    4TU.ResearchData
    Authors
    Maikel Leemans
    License

    https://doi.org/10.4121/resource:terms_of_use

    Description

    XES software event log obtained by instrumenting JUnit 4.12 using the tool available at https://svn.win.tue.nl/repos/prom/XPort/. This event log contains method-call-level events describing a single run of the JUnit 4.12 software (available at https://mvnrepository.com/artifact/junit/junit/4.12), using the input from https://github.com/junit-team/junit4/wiki/Getting-started. Note that the life-cycle information in this log corresponds to method call (start) and return (complete), and captures a method-call hierarchy.
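Because the life-cycle attributes mark call starts and returns, the method-call hierarchy can be rebuilt with a stack. A sketch with invented method names; the real log's events carry XES attributes rather than plain tuples:

```python
# Invented (method, lifecycle) events; a real XES log stores these as
# concept:name and lifecycle:transition attributes.
events = [
    ("main", "start"),
    ("runTest", "start"),
    ("assertEquals", "start"),
    ("assertEquals", "complete"),
    ("runTest", "complete"),
    ("main", "complete"),
]

stack, calls = [], []
for method, lifecycle in events:
    if lifecycle == "start":
        calls.append((len(stack), method))  # record nesting depth at call time
        stack.append(method)
    else:
        popped = stack.pop()
        assert popped == method  # each complete must match the open call

for depth, method in calls:
    print("  " * depth + method)
```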

  19. Process Models obtained from event logs with different information-preserving abstractions

    • data-staging.niaid.nih.gov
    • data.niaid.nih.gov
    Updated Jan 24, 2020
    Cite
    Sander J.J. Leemans (2020). Process Models obtained from event logs with with different information-preserving abstractions [Dataset]. https://data-staging.niaid.nih.gov/resources?id=zenodo_3243987
    Explore at:
    Dataset updated
    Jan 24, 2020
    Dataset provided by
    Sander J.J. Leemans
    Dirk Fahland
    License

    Attribution 4.0 (CC BY 4.0): https://creativecommons.org/licenses/by/4.0/
    License information was derived automatically

    Description

    This dataset contains results of the experiment analyzing information preservation and recovery by different event log abstractions in process mining, described in: Sander J.J. Leemans, Dirk Fahland, "Information-Preserving Abstractions of Event Data in Process Mining", Knowledge and Information Systems, ISSN 0219-1377 (print) / 0219-3116 (online), accepted May 2019.

    The experiment results were obtained with: https://doi.org/10.5281/zenodo.3243981

  20. Validation of Precision Measures - Event Logs and Process Models

    • data.4tu.nl
    zip
    Updated Jul 28, 2020
    Cite
    Niek Tax (2020). Validation of Precision Measures - Event Logs and Process Models [Dataset]. http://doi.org/10.4121/uuid:991753f7-a240-4ba6-a8a8-67174a08c51b
    Explore at:
    Available download formats: zip
    Dataset updated
    Jul 28, 2020
    Dataset provided by
    4TU.ResearchData
    Authors
    Niek Tax
    License

    https://doi.org/10.4121/resource:terms_of_use

    Description

    This collection contains the event logs and process models described and used in the paper "The Imprecisions of Precision Measures in Process Mining".
