Inactive
Analyzed 4 days ago, based on code collected 4 days ago.

Project Summary

Suite of Python scripts for parsing MediaWiki dumps (from Wikipedia and similar sites) and extracting interesting information: for example, extracting a (social) network from User Talk pages, collecting users' contribution histories, analyzing the editing history of traumatic-event pages, and extracting emotions from wiki texts.
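As an illustration of the first use case, here is a minimal sketch (not the project's actual code) of extracting who wrote on whose User Talk page from a MediaWiki XML dump, which is one way to derive a social-network edge list. Real dumps declare an XML namespace (e.g. an `http://www.mediawiki.org/xml/export-...` URI); it is omitted here to keep the sample short.

```python
import xml.etree.ElementTree as ET

# Tiny hand-written excerpt standing in for a real dump file.
SAMPLE_DUMP = """<mediawiki>
  <page>
    <title>User talk:Alice</title>
    <revision><contributor><username>Bob</username></contributor></revision>
    <revision><contributor><username>Carol</username></contributor></revision>
  </page>
  <page>
    <title>User talk:Bob</title>
    <revision><contributor><username>Alice</username></contributor></revision>
  </page>
</mediawiki>"""

def talk_page_edges(xml_text):
    """Yield (writer, talk-page owner) pairs from User Talk pages."""
    root = ET.fromstring(xml_text)
    for page in root.iter("page"):
        title = page.findtext("title", "")
        if not title.startswith("User talk:"):
            continue  # only User Talk pages carry user-to-user messages
        owner = title[len("User talk:"):]
        for rev in page.iter("revision"):
            writer = rev.findtext("contributor/username")
            if writer and writer != owner:  # skip self-edits
                yield (writer, owner)

edges = list(talk_page_edges(SAMPLE_DUMP))
# edges == [("Bob", "Alice"), ("Carol", "Alice"), ("Alice", "Bob")]
```

For full-size dumps one would stream with `ET.iterparse` instead of loading the whole file, but the edge-extraction logic stays the same.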

Tags

No tags have been added


Quick Reference

GNU General Public License v2.0 or later

Permitted: Commercial Use, Modify, Distribute, Place Warranty
Forbidden: Sub-License, Hold Liable
Required: Distribute Original, Disclose Source, Include Copyright, State Changes, Include License

These details are provided for information only; nothing here is legal advice, and it should not be used as such.


This project has no vulnerabilities reported against it.


Languages

Python: 93%
5 Other: 7%

30 Day Summary

Jan 17 2018 — Feb 16 2018

12 Month Summary

Feb 16 2017 — Feb 16 2018

Ratings

Be the first to rate this project