Low Activity
Analyzed 4 days ago, based on code collected 4 days ago.

Project Summary

Apache Mahout's goal is to build scalable machine learning libraries. By scalable we mean:
Scalable to reasonably large data sets. Our core algorithms for clustering, classification, and batch-based collaborative filtering are implemented on top of Apache Hadoop using the map/reduce paradigm. However, we do not restrict contributions to Hadoop-based implementations: contributions that run on a single node or on a non-Hadoop cluster are welcome as well. The core libraries are also highly optimized to give good performance for non-distributed algorithms.
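To illustrate the map/reduce paradigm the summary refers to, here is a minimal, self-contained sketch of a single k-means clustering iteration written in that style: a "map" phase assigns each point to its nearest centroid, and a "reduce" phase averages the points grouped under each centroid. This is a hypothetical illustration only, not Mahout's actual API or implementation (all class and method names here are made up for the example).

```java
import java.util.*;
import java.util.stream.*;

// Hypothetical sketch of one k-means iteration in map/reduce style.
// Not Mahout code: class and method names are invented for illustration.
public class KMeansStep {
    // "Map" phase: emit (nearest-centroid-index, point) for each point.
    static int nearest(double point, double[] centroids) {
        int best = 0;
        for (int i = 1; i < centroids.length; i++) {
            if (Math.abs(point - centroids[i]) < Math.abs(point - centroids[best])) {
                best = i;
            }
        }
        return best;
    }

    // "Reduce" phase: for each centroid key, average its assigned points
    // to produce the updated centroid.
    static double[] step(double[] points, double[] centroids) {
        Map<Integer, List<Double>> groups = Arrays.stream(points).boxed()
            .collect(Collectors.groupingBy(p -> nearest(p, centroids)));
        double[] updated = centroids.clone();
        groups.forEach((k, pts) ->
            updated[k] = pts.stream().mapToDouble(Double::doubleValue)
                            .average().orElse(centroids[k]));
        return updated;
    }

    public static void main(String[] args) {
        double[] points = {1.0, 1.2, 0.8, 9.0, 9.5, 8.5};
        double[] centroids = {0.0, 10.0};
        // One iteration pulls each centroid toward the mean of its cluster.
        System.out.println(Arrays.toString(step(points, centroids)));
        // prints [1.0, 9.0]
    }
}
```

In a real Hadoop job the "map" and "reduce" phases would run on separate nodes over partitioned data; the point of the sketch is only to show how the algorithm decomposes into those two phases.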

Tags

algorithms classifiers clustering collaborative_filtering datamining data_mining dimension_reduction distributed distributed_computing hadoop java library machinelearning machine_learning mapreduce recommender regression


Apache License 2.0

Permitted
  • Commercial Use
  • Modify
  • Distribute
  • Place Warranty
  • Private Use
  • Use Patent Claims
  • Sub-License

Forbidden
  • Use Trademarks
  • Hold Liable

Required
  • Include Copyright
  • Include License
  • State Changes
  • Include Notice

These details are provided for information only; they are not legal advice and should not be used as such.


This project has no vulnerabilities reported against it.


Languages

  • Java: 77%
  • Scala: 13%
  • 9 Other: 10%

30 Day Summary

Sep 18 2018 — Oct 18 2018

12 Month Summary

Oct 18 2017 — Oct 18 2018
  • 55 Commits
    Down 166 (75%) from previous 12 months
  • 6 Contributors
    Down 10 (62%) from previous 12 months