
matlab.compiler.mlspark.SparkContext class

Package: matlab.compiler.mlspark
Superclasses:

Interface class to initialize a connection to a Spark-enabled cluster

Description

A SparkContext object serves as an entry point to Spark™ by initializing a connection to a Spark cluster. It accepts a SparkConf object as an input argument and uses the parameters specified in that object to set up the internal services necessary to establish a connection to the Spark execution environment.

Construction

sc = matlab.compiler.mlspark.SparkContext(conf) creates a SparkContext object that initializes a connection to a Spark cluster.

Input Arguments

conf - SparkConf object specifying the configuration parameters. Pass the SparkConf object as input to the SparkContext constructor.

Example: sc = matlab.compiler.mlspark.SparkContext(conf);

See matlab.compiler.mlspark.SparkConf for information on how to create a SparkConf object.

Properties

The properties of this class are hidden.

Methods

addJar - Add a JAR file dependency for all tasks that need to be executed in a SparkContext
broadcast - Broadcast a read-only variable to the cluster
datastoreToRDD - Convert a MATLAB datastore to a Spark RDD
delete - Shut down the connection to a Spark-enabled cluster
getSparkConf - Get SparkConf configuration parameters
parallelize - Create an RDD from a collection of local MATLAB values
setCheckpointDir - Set the directory under which RDDs are to be checkpointed
setLogLevel - Set the log level
textFile - Create an RDD from a text file
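As a sketch of how these methods are typically combined (assuming sc is a SparkContext created as shown in the Examples section; the file path and data values here are illustrative placeholders, not part of this reference):

```matlab
% Reduce Spark logging noise
sc.setLogLevel('WARN');

% Create an RDD from a collection of local MATLAB values
rdd = sc.parallelize({1, 2, 3, 4, 5});

% Create an RDD from a text file (path is hypothetical)
lines = sc.textFile('/tmp/input.txt');

% Retrieve the configuration parameters used by this context
conf = sc.getSparkConf();

% Shut down the connection when finished
delete(sc);
```

See the individual method reference pages for the exact signatures and return types.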

Examples


The SparkContext class initializes a connection to a Spark-enabled cluster using Spark properties.

% Set up Spark properties as a containers.Map object
sparkProp = containers.Map({'spark.executor.cores'}, {'1'}); 

% Create SparkConf object
conf = matlab.compiler.mlspark.SparkConf('AppName','myApp', ...
                        'Master','local[1]','SparkProperties',sparkProp);
% Create a SparkContext
sc = matlab.compiler.mlspark.SparkContext(conf);
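Continuing from the example above, a common follow-up (sketched here under the assumption that sc is still connected; the checkpoint directory path is a placeholder) is to configure checkpointing, broadcast a value, and shut down the context when done:

```matlab
% Set the directory under which RDDs are to be checkpointed
% (path is illustrative)
sc.setCheckpointDir('/tmp/spark-checkpoints');

% Broadcast a read-only variable to the cluster
b = sc.broadcast(42);

% Shut down the connection to the Spark-enabled cluster
delete(sc);
```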

More About


References

See the latest Spark documentation for more information.

Introduced in R2016b