matlab.compiler.mlspark.SparkContext Class

Namespace: matlab.compiler.mlspark

Interface class to initialize a connection to a Spark-enabled cluster


A SparkContext object serves as an entry point to Spark™ by initializing a connection to a Spark cluster. It accepts a SparkConf object as an input argument and uses the parameters specified in that object to set up the internal services necessary to establish a connection to the Spark execution environment.


sc = matlab.compiler.mlspark.SparkContext(conf) creates a SparkContext object that initializes a connection to a Spark cluster.

Input Arguments


conf - Configuration parameters for the Spark connection, specified as a matlab.compiler.mlspark.SparkConf object. Pass the SparkConf object as input to the SparkContext.

Example: sc = matlab.compiler.mlspark.SparkContext(conf);

See matlab.compiler.mlspark.SparkConf for information on how to create a SparkConf object.


Properties

The properties of this class are hidden.


Methods

addJar - Add a JAR file dependency for all tasks that need to be executed in a SparkContext
broadcast - Broadcast a read-only variable to the cluster
datastoreToRDD - Convert a MATLAB datastore to a Spark RDD
delete - Shut down the connection to a Spark-enabled cluster
getSparkConf - Get SparkConf configuration parameters
parallelize - Create an RDD from a collection of local MATLAB values
setCheckpointDir - Set the directory under which RDDs are to be checkpointed
setLogLevel - Set the log level
textFile - Create an RDD from a text file
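A minimal sketch showing several of these methods used in sequence. It assumes conf is a previously created matlab.compiler.mlspark.SparkConf object; the log level string and file path are placeholder values, not part of this reference page:

```matlab
% Assumes conf is an existing matlab.compiler.mlspark.SparkConf object
sc = matlab.compiler.mlspark.SparkContext(conf);

% Reduce logging noise while developing (level string is an example value)
sc.setLogLevel('WARN');

% Create an RDD from a collection of local MATLAB values
numRDD = sc.parallelize({1,2,3,4});

% Create an RDD from a text file (placeholder path)
linesRDD = sc.textFile('/path/to/input.txt');

% Shut down the connection to the cluster when finished
sc.delete();
```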


Examples

The SparkContext class initializes a connection to a Spark-enabled cluster using Spark properties.

% Setup Spark Properties as a containers.Map object
sparkProp = containers.Map({'spark.executor.cores'}, {'1'}); 

% Create SparkConf object
conf = matlab.compiler.mlspark.SparkConf('AppName','myApp', ...
    'Master','local[1]', ...
    'SparkProperties',sparkProp);

% Create a SparkContext
sc = matlab.compiler.mlspark.SparkContext(conf);

More About



See the latest Spark documentation for more information.

Version History

Introduced in R2016b