SparkConf
SparkConf allows us to configure a Spark application. It sets various Spark parameters as key-value pairs, so you will usually create a SparkConf object with the SparkConf() constructor, which also loads values from any spark.* Java system properties.
There are a few useful functions. For example, we can use the set() function to set a configuration property, the setMaster() function to set the master URL to connect to, the setAppName() function to set the application name, and the setSparkHome() function to set the path where Spark is installed on worker nodes.
You can learn more about SparkConf at https://spark.apache.org/docs/0.9.0/api/pyspark/pyspark.conf.SparkConf-class.html.