Distributed Machine Learning and Big Data Analysis with PySpark

From cslt Wiki
Revision as of 23:11, 2 March 2016 by Fanmiao (Talk | contribs)


Apache Spark is an open-source cluster computing framework. Originally developed at the University of California, Berkeley, the Spark codebase was later donated to the Apache Software Foundation, which has maintained it since. Spark provides an interface for programming entire clusters with implicit data parallelism and fault tolerance.