Text Analytics Toolbox Model for BERT-Small Network
Pretrained BERT-Small Network for MATLAB.
64 Downloads · Updated 11 Sep 2024
BERT-Small is a pretrained language model based on the Transformer deep learning architecture and can be used for a wide variety of natural language processing (NLP) tasks. This model has 4 self-attention layers and a hidden size of 512.
To load a BERT-Small model, you can run the following code:
[net, tokenizer] = bert(Model="small");
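Once loaded, the returned tokenizer can convert raw text into the token codes the network expects. A minimal sketch of this workflow is shown below; the sample string is illustrative, and the code assumes Text Analytics Toolbox (R2023b or later) with this support package installed.

```matlab
% Load the pretrained BERT-Small network and its tokenizer
% (requires Text Analytics Toolbox and this support package).
[net, tokenizer] = bert(Model="small");

% Tokenize an example sentence and encode it as numeric token codes.
str = "MATLAB supports transformer models.";
[tokenCodes, segments] = encode(tokenizer, str);
```

The token codes and segment indices produced by `encode` can then be passed to the network for downstream NLP tasks such as classification or feature extraction.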
MATLAB Release Compatibility
Created with
R2023b
Compatible with R2023b to R2024b
Platform Compatibility
Windows · macOS (Apple silicon) · macOS (Intel) · Linux