4 datasets found
  1. PAN Wikipedia Vandalism Corpus 2010 (PAN-WVC-10)
     • live.european-language-grid.eu
     txt
     Updated Apr 26, 2024
  2. PAN Wikipedia Vandalism Corpus 2010 (PAN-WVC-10)
     • zenodo.org
     • data.niaid.nih.gov
     zip
     Updated Jan 24, 2020
  3. Webis-WVC-07
     • webis.de
     Updated 2007
  4. Webis Wikipedia Vandalism Corpus (Webis-WVC-07)
     • data.niaid.nih.gov
     • zenodo.org
     Updated Jan 24, 2020

Cite
(2024). PAN Wikipedia Vandalism Corpus 2010 (PAN-WVC-10) [Dataset]. https://live.european-language-grid.eu/catalogue/corpus/7557

PAN Wikipedia Vandalism Corpus 2010 (PAN-WVC-10)

Cited by 3 scholarly articles (view in Google Scholar)
Available download formats: txt
Dataset updated: Apr 26, 2024
License: Attribution 4.0 (CC BY 4.0), https://creativecommons.org/licenses/by/4.0/
License information was derived automatically.

Description

The PAN Wikipedia Vandalism Corpus 2010 (PAN-WVC-10) is a corpus for the evaluation of automatic vandalism detectors for Wikipedia. The corpus can be used free of charge for research purposes.

This corpus is supplemented by the PAN-WVC-11, which features additional edits in English, Spanish, and German. The two corpora should be used together to obtain more representative results.

As part of our research on automatic vandalism detection, we have compiled a corpus of vandalism cases found in Wikipedia. The corpus comprises 32,452 edits on 28,468 Wikipedia articles, among which 2,391 vandalism edits have been identified. To annotate the corpus we used Amazon's Mechanical Turk: 753 workers were recruited and cast more than 150,000 votes on the edits, so that each edit was reviewed by at least 3 annotators. The level of agreement among the annotators was then analyzed to label each edit as "regular" or "vandalism."
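The per-edit labels are derived by aggregating the crowd votes, with each edit receiving the class chosen by the majority of its annotators. Below is a minimal sketch of such an aggregation; the file name annotations.csv and its editid/class columns are illustrative assumptions, so check the README distributed with the corpus for the actual file names and schema.

```python
import csv
from collections import Counter, defaultdict

# Hypothetical layout: one row per worker vote, with columns
# "editid" and "class" ("regular" or "vandalism").
ANNOTATIONS_CSV = "annotations.csv"

# Tally votes per edit.
votes = defaultdict(Counter)
with open(ANNOTATIONS_CSV, newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        votes[row["editid"]][row["class"]] += 1

# Assign the majority class; edits without a strict majority are
# marked "undecided" (the corpus authors describe resolving such
# cases with additional annotation rounds).
labels = {}
for edit_id, counts in votes.items():
    (top_class, top_votes), = counts.most_common(1)
    total = sum(counts.values())
    labels[edit_id] = top_class if top_votes * 2 > total else "undecided"

print(Counter(labels.values()))
```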
