Jubatus
| Developer(s) | Nippon Telegraph and Telephone & Preferred Infrastructure |
| --- | --- |
| Stable release | 0.4.3 / April 19, 2013 |
| Development status | Active |
| Written in | C++ |
| Operating system | Linux |
| Type | Machine learning |
| License | GNU Lesser General Public License 2.1 |
| Website | jubat.us |
Jubatus is an open-source online machine learning and distributed computing framework developed by Nippon Telegraph and Telephone and Preferred Infrastructure. It provides features such as classification, recommendation, regression, anomaly detection, and graph mining, and it offers client libraries for C++, Java, Ruby, and Python. For distributed machine learning, Jubatus uses Iterative Parameter Mixture.[1][2]
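The snippet below is a minimal sketch of the Iterative Parameter Mixture idea described in the cited papers,[1][2] simulated within a single Python process rather than taken from Jubatus's C++ implementation: each worker trains a linear model on its own data shard, and after every iteration the workers' weight vectors are averaged into one model that seeds the next round. The perceptron learner, the shard layout, and all names here are illustrative assumptions.

```python
import numpy as np

def local_perceptron_pass(w, X, y, lr=1.0):
    """One pass of perceptron updates over a single worker's data shard."""
    w = w.copy()
    for xi, yi in zip(X, y):
        if yi * np.dot(w, xi) <= 0:   # misclassified -> update
            w += lr * yi * xi
    return w

def iterative_parameter_mixture(shards, dim, iterations=10):
    """Average the workers' locally trained weights after every iteration."""
    w = np.zeros(dim)
    for _ in range(iterations):
        # each worker starts from the current mixed model
        local_weights = [local_perceptron_pass(w, X, y) for X, y in shards]
        # uniform parameter mixture: average the workers' weight vectors
        w = np.mean(local_weights, axis=0)
    return w

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_w = np.array([1.0, -2.0, 0.5])
    # simulate three workers, each holding its own shard of labeled data
    shards = []
    for _ in range(3):
        X = rng.normal(size=(200, 3))
        y = np.sign(X @ true_w)
        shards.append((X, y))
    print("learned weights:", iterative_parameter_mixture(shards, dim=3))
```

Mixing the parameters after every iteration, rather than averaging independently trained models only once at the end, is what distinguishes Iterative Parameter Mixture from simple one-shot parameter averaging.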
Notable Features
Jubatus supports:
- Multi-class classification algorithms: Perceptron, Passive Aggressive,[3][4][5] Confidence Weighted Learning,[6][7][8] AROW[9] and Normal Herd[10] (a sketch of the Passive Aggressive update follows this list)
- Recommendation algorithms using MinHash, Locality-Sensitive Hashing (LSH) and Euclid LSH
- Regression algorithms: Passive Aggressive regression
- Feature extraction methods for natural language text
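As a rough illustration of the Passive Aggressive family cited above,[3][5] the binary PA-I classifier updates its weights only when the hinge loss on the current example is non-zero, stepping just far enough to satisfy the margin while staying close to the previous weight vector. The code below is a generic sketch of that update rule, not Jubatus's API; the class and variable names are assumptions.

```python
import numpy as np

class PassiveAggressiveI:
    """Minimal binary PA-I classifier (Crammer et al., 2006)."""

    def __init__(self, dim, C=1.0):
        self.w = np.zeros(dim)
        self.C = C  # aggressiveness parameter caps the step size

    def update(self, x, y):
        """Online update on one example x with label y in {-1, +1}."""
        loss = max(0.0, 1.0 - y * np.dot(self.w, x))  # hinge loss
        if loss > 0.0:
            # step just far enough to reach margin 1, capped by C (PA-I)
            tau = min(self.C, loss / np.dot(x, x))
            self.w += tau * y * x

    def predict(self, x):
        return 1 if np.dot(self.w, x) >= 0 else -1

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    true_w = np.array([2.0, -1.0])
    clf = PassiveAggressiveI(dim=2, C=1.0)
    for _ in range(1000):
        x = rng.normal(size=2)
        clf.update(x, 1 if x @ true_w > 0 else -1)
    print("learned weights:", clf.w)
```

The PA-II variant instead smooths the step size as tau = loss / (||x||^2 + 1/(2C)), trading off margin violations against stability on noisy data.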
References
1. Ryan McDonald, K. Hall and G. Mann, "Distributed Training Strategies for the Structured Perceptron", North American Chapter of the Association for Computational Linguistics (NAACL), 2010.
2. Gideon Mann, R. McDonald, M. Mohri, N. Silberman and D. Walker, "Efficient Large-Scale Distributed Training of Conditional Maximum Entropy Models", Neural Information Processing Systems (NIPS), 2009.
3. Koby Crammer, Ofer Dekel, Shai Shalev-Shwartz and Yoram Singer, "Online Passive-Aggressive Algorithms", Neural Information Processing Systems (NIPS), 2003.
4. Koby Crammer and Yoram Singer, "Ultraconservative Online Algorithms for Multiclass Problems", Journal of Machine Learning Research, 2003.
5. Koby Crammer, Ofer Dekel, Joseph Keshet, Shai Shalev-Shwartz and Yoram Singer, "Online Passive-Aggressive Algorithms", Journal of Machine Learning Research, 2006.
6. Mark Dredze, Koby Crammer and Fernando Pereira, "Confidence-Weighted Linear Classification", Proceedings of the 25th International Conference on Machine Learning (ICML), 2008.
7. Koby Crammer, Mark Dredze and Fernando Pereira, "Exact Convex Confidence-Weighted Learning", Neural Information Processing Systems (NIPS), 2008.
8. Koby Crammer, Mark Dredze and Alex Kulesza, "Multi-Class Confidence Weighted Algorithms", Empirical Methods in Natural Language Processing (EMNLP), 2009.
9. Koby Crammer, Alex Kulesza and Mark Dredze, "Adaptive Regularization of Weight Vectors", Advances in Neural Information Processing Systems (NIPS), 2009.
10. Koby Crammer and Daniel D. Lee, "Learning via Gaussian Herding", Neural Information Processing Systems (NIPS), 2010.