# lstm

Long short-term memory

## Syntax

`dlY = lstm(dlX,H0,C0,weights,recurrentWeights,bias)`

`[dlY,hiddenState,cellState] = lstm(dlX,H0,C0,weights,recurrentWeights,bias)`

`[___] = lstm(___,'DataFormat',FMT)`

## Description

The long short-term memory (LSTM) operation allows a network to learn long-term dependencies between time steps in time series and sequence data.

**Note**

This function applies the deep learning LSTM operation to `dlarray` data. If you want to apply an LSTM operation within a `layerGraph` object or `Layer` array, use the following layer:

- `lstmLayer`

`dlY = lstm(dlX,H0,C0,weights,recurrentWeights,bias)` applies a long short-term memory (LSTM) calculation to input `dlX` using the initial hidden state `H0`, initial cell state `C0`, and parameters `weights`, `recurrentWeights`, and `bias`. The input `dlX` must be a formatted `dlarray`. The output `dlY` is a formatted `dlarray` with the same dimension format as `dlX`, except for any `'S'` dimensions.

The `lstm` function updates the cell and hidden states using the hyperbolic tangent function (tanh) as the state activation function. The `lstm` function uses the sigmoid function given by $$\sigma (x)={(1+{e}^{-x})}^{-1}$$ as the gate activation function.
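
For context, the conventional LSTM update that combines these two activation functions (a standard formulation; the per-gate symbols below are not defined on this page) is:

$$
\begin{aligned}
i_t &= \sigma(W_i x_t + R_i h_{t-1} + b_i) \\
f_t &= \sigma(W_f x_t + R_f h_{t-1} + b_f) \\
g_t &= \tanh(W_g x_t + R_g h_{t-1} + b_g) \\
o_t &= \sigma(W_o x_t + R_o h_{t-1} + b_o) \\
c_t &= f_t \odot c_{t-1} + i_t \odot g_t \\
h_t &= o_t \odot \tanh(c_t)
\end{aligned}
$$

where $W$, $R$, and $b$ denote per-gate slices of `weights`, `recurrentWeights`, and `bias`, $\odot$ is elementwise multiplication, and $h_t$ and $c_t$ are the hidden and cell states.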

`[dlY,hiddenState,cellState] = lstm(dlX,H0,C0,weights,recurrentWeights,bias)` also returns the hidden state and cell state after the LSTM operation.

`[___] = lstm(___,'DataFormat',FMT)` also specifies the dimension format `FMT` when `dlX` is not a formatted `dlarray`. The output `dlY` is an unformatted `dlarray` with the same dimension order as `dlX`, except for any `'S'` dimensions.
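
As a minimal sketch of the `'DataFormat'` syntax (the sizes below are illustrative, not taken from this page):

```matlab
% Illustrative sizes (assumptions, not from this page).
numFeatures = 10; numObservations = 32; sequenceLength = 64; numHiddenUnits = 100;

% Unformatted dlarray: the dimension meanings are supplied via 'DataFormat'.
dlX = dlarray(randn(numFeatures,numObservations,sequenceLength));

% Initial states and learnable parameters (gates are stacked along dim 1).
H0 = zeros(numHiddenUnits,1);
C0 = zeros(numHiddenUnits,1);
weights = dlarray(randn(4*numHiddenUnits,numFeatures));
recurrentWeights = dlarray(randn(4*numHiddenUnits,numHiddenUnits));
bias = dlarray(randn(4*numHiddenUnits,1));

% 'CBT' marks the channel, batch, and time dimensions of dlX.
dlY = lstm(dlX,H0,C0,weights,recurrentWeights,bias,'DataFormat','CBT');
```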

## Examples
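
The following is a minimal sketch of applying `lstm` to a formatted `dlarray`; the sizes are illustrative assumptions, not values from this page.

```matlab
% Illustrative sizes (assumptions, not from this page).
numFeatures = 10; numObservations = 32; sequenceLength = 64; numHiddenUnits = 100;

% Create input data formatted with 'C' (channel), 'B' (batch), 'T' (time) dimensions.
dlX = dlarray(randn(numFeatures,numObservations,sequenceLength),'CBT');

% Initialize the hidden state, cell state, and learnable parameters.
% The four gates are stacked along the first dimension, hence 4*numHiddenUnits.
H0 = zeros(numHiddenUnits,1);
C0 = zeros(numHiddenUnits,1);
weights = dlarray(randn(4*numHiddenUnits,numFeatures));
recurrentWeights = dlarray(randn(4*numHiddenUnits,numHiddenUnits));
bias = dlarray(randn(4*numHiddenUnits,1));

% Apply the LSTM operation and return the updated states.
[dlY,hiddenState,cellState] = lstm(dlX,H0,C0,weights,recurrentWeights,bias);
```

Because `dlX` is formatted, `dlY` is a formatted `dlarray` with the same `'CBT'` dimension format.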

## Input Arguments

## Output Arguments

## Limitations

`functionToLayerGraph` does not support the `lstm` function. If you use `functionToLayerGraph` with a function that contains the `lstm` operation, the resulting `LayerGraph` contains placeholder layers.

## More About

## Extended Capabilities

## See Also

`dlarray` | `fullyconnect` | `softmax` | `dlgradient` | `dlfeval` | `gru`

**Introduced in R2019b**