matlab.compiler.mlspark.SparkContext class

Package: matlab.compiler.mlspark
Interface class to initialize a connection to a Spark enabled cluster

Description

A SparkContext object serves as an entry point to Spark™ by initializing a connection to a Spark cluster. It accepts a SparkConf object as an input argument and uses the parameters specified in that object to set up the internal services necessary to establish a connection to the Spark execution environment.

Construction

sc = matlab.compiler.mlspark.SparkContext(conf) creates a SparkContext object that initializes a connection to a Spark cluster.

Input Arguments

conf - SparkConf object containing the configuration parameters of the Spark cluster. Pass the SparkConf object as input to the SparkContext.

Example: sc = matlab.compiler.mlspark.SparkContext(conf);

See matlab.compiler.mlspark.SparkConf for information on how to create a SparkConf object.

Properties

The properties of this class are hidden.

Methods

addJar - Add a JAR file dependency for all tasks that need to be executed in a SparkContext
broadcast - Broadcast a read-only variable to the cluster
datastoreToRDD - Convert a MATLAB datastore to a Spark RDD
delete - Shut down the connection to a Spark enabled cluster
getSparkConf - Get SparkConf configuration parameters
parallelize - Create an RDD from a collection of local MATLAB values
setCheckpointDir - Set the directory under which RDDs are to be checkpointed
setLogLevel - Set the log level
textFile - Create an RDD from a text file
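
As an illustration, several of these methods can be combined in a typical workflow. The sketch below assumes sc is an existing SparkContext; the file and directory paths shown are hypothetical placeholders.

```matlab
% Sketch of a typical SparkContext workflow; assumes sc is a valid
% SparkContext and that the paths below are hypothetical examples.
sc.setLogLevel('WARN');                  % reduce console logging
sc.setCheckpointDir('/tmp/checkpoint');  % directory for RDD checkpoints

% Create an RDD from local MATLAB values
numRDD = sc.parallelize({1, 2, 3, 4, 5});

% Create an RDD from a text file
lineRDD = sc.textFile('/tmp/words.txt');

% Shut down the connection when finished
sc.delete();
```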

Examples


The SparkContext class initializes a connection to a Spark enabled cluster using Spark properties.

% Set up Spark properties as a containers.Map object
sparkProp = containers.Map({'spark.executor.cores'}, {'1'}); 

% Create SparkConf object
conf = matlab.compiler.mlspark.SparkConf('AppName','myApp', ...
                        'Master','local[1]','SparkProperties',sparkProp);
% Create a SparkContext
sc = matlab.compiler.mlspark.SparkContext(conf);
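
Continuing from the example above, the SparkContext can then be used to create and operate on an RDD. This is a minimal sketch; the collect method is assumed to be available on the returned RDD object (see matlab.compiler.mlspark.RDD).

```matlab
% Create an RDD from local MATLAB values and retrieve its contents
rdd = sc.parallelize({1, 2, 3});
vals = rdd.collect();   % assumes the RDD class provides a collect method

% Shut down the connection to the cluster when done
sc.delete();
```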


References

See the latest Spark documentation for more information.

Introduced in R2016b