Is it possible (yet) to implement a maxout activation "layer" in an R2017b Deep Learning network?

Maxout is an activation function that includes ReLU and leaky ReLU as special cases, allowing for piecewise linear (planar/hyperplanar) activation functions. Maxout units seem to work better than either in a number of cases. Here's a reference: Goodfellow, I. J., Warde-Farley, D., Mirza, M., Courville, A., & Bengio, Y. (2013). Maxout networks. arXiv preprint arXiv:1302.4389. https://arxiv.org/abs/1302.4389
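For concreteness, a maxout hidden unit takes the maximum over k learned affine pieces. A minimal MATLAB sketch of the forward pass for a fully connected case (the function and variable names are mine, not anything from the toolbox):

    % Maxout forward pass, fully connected case (illustrative only).
    % x : d-by-1 input vector
    % W : d-by-m-by-k weights, b : m-by-k biases, k linear pieces per unit
    % h : m-by-1 output, h(i) = max_j ( x'*W(:,i,j) + b(i,j) )
    function h = maxoutForward(x, W, b)
        [~, m, k] = size(W);
        z = zeros(m, k);
        for j = 1:k
            z(:, j) = W(:, :, j)' * x + b(:, j);   % j-th affine piece
        end
        h = max(z, [], 2);                         % elementwise max over the k pieces
    end

With k = 2 and one piece fixed at zero this reduces to ReLU; fixing that piece at a small slope instead gives leaky ReLU.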
Ultimately I’m interested in playing with architectures like this one, which use maxouts extensively:
Zhang, Y., Pezeshki, M., Brakel, P., Zhang, S., Laurent, C., Bengio, Y., & Courville, A. (2017). Towards end-to-end speech recognition with deep convolutional neural networks. arXiv preprint arXiv:1701.02720. https://arxiv.org/abs/1701.02720 (Speech recognition using convolutional nets with maxout activations.)
But I simply can't see any way to fake a maxout activation in the R2017b convolutional network framework. While I'm a MATLAB vet (since Version 4, I think), I'm a total newbie to MATLAB deep learning networks, so maybe I'm missing something. Any suggestions greatly appreciated.
-Terry Nearey

Answers (1)

Pankaj Wasnik on 2 Jan 2018
Hi, you can try https://github.com/yechengxi/LightNet, a somewhat simpler CNN toolbox that is easier to understand and to debug. There you can try implementing the maxout layer yourself. I am attempting the same; if I finish before you, I will share the code.
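One possible workaround within the stock R2017b toolbox, using the identity max(a,b) = b + relu(a-b): since a and b are both learned affine maps of the layer input, their difference a-b is affine too, so a second convolution can learn it directly. A two-piece maxout can then be assembled from standard layers in a DAG network (layerGraph and additionLayer are available in R2017b). A minimal sketch, with input size, filter counts, and layer names of my own choosing:

    % Two-piece maxout block built from stock R2017b layers.
    % Uses max(a,b) = b + relu(a-b); the difference a-b is itself an
    % affine map, so it is learned directly by the second convolution.
    layers = [ ...
        imageInputLayer([28 28 1], 'Name', 'in')
        convolution2dLayer(3, 16, 'Padding', 1, 'Name', 'conv_b')];   % piece b
    lgraph = layerGraph(layers);
    branch = [ ...
        convolution2dLayer(3, 16, 'Padding', 1, 'Name', 'conv_diff')  % learns a - b
        reluLayer('Name', 'relu_diff')];
    lgraph = addLayers(lgraph, branch);
    lgraph = addLayers(lgraph, additionLayer(2, 'Name', 'maxout'));   % b + relu(a-b)
    lgraph = connectLayers(lgraph, 'in', 'conv_diff');
    lgraph = connectLayers(lgraph, 'conv_b', 'maxout/in1');
    lgraph = connectLayers(lgraph, 'relu_diff', 'maxout/in2');

The sketch omits the downstream and output layers you would need before training. Note this trick covers only k = 2 pieces; for a general k-piece maxout you would need a proper custom layer, since the stock layer set has no elementwise max or subtraction layer to cascade the construction.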
Regards, Pankaj Wasnik
