
MathWorks Expands Deep Learning Capabilities in Release 2018b of the MATLAB and Simulink Product Families

MathWorks today introduced Release 2018b of MATLAB and Simulink. The release contains significant enhancements for deep learning, along with new capabilities and bug fixes across the product families. The new Deep Learning Toolbox, which replaces Neural Network Toolbox, provides engineers and scientists with a framework for designing and implementing deep neural networks. Now, image processing, computer vision, signal processing, and systems engineers can use MATLAB to more easily design complex network architectures and improve the performance of their deep learning models.

MathWorks recently joined the ONNX community to demonstrate its commitment to interoperability, enabling collaboration between users of MATLAB and other deep learning frameworks. Using the new ONNX converter in R2018b, engineers can import and export models from supported frameworks such as PyTorch, MXNet, and TensorFlow. This interoperability enables models trained in MATLAB to be used in other frameworks. Similarly, models trained in other frameworks can be brought into MATLAB for tasks such as debugging, validation, and embedded deployment. In addition, R2018b provides a curated set of reference models that are accessible with a single line of code. Additional importers also support models from Caffe and Keras-TensorFlow.
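
As an illustration of the converter workflow described above, the following is a minimal MATLAB sketch, assuming the relevant support packages (the ONNX converter and a pretrained model package such as SqueezeNet) are installed; the file names are placeholders, not examples from the release itself:

    % Load a curated pretrained reference model with a single line of code
    net = squeezenet;

    % Export a network trained in MATLAB to an ONNX model file
    exportONNXNetwork(net, 'squeezenet.onnx');

    % Import an ONNX model produced by another framework (e.g., PyTorch or MXNet)
    importedNet = importONNXNetwork('modelFromPyTorch.onnx', ...
        'OutputLayerType', 'classification');

    % Additional importers for Caffe and Keras-TensorFlow models
    kerasNet = importKerasNetwork('model.h5');
    caffeNet = importCaffeNetwork('deploy.prototxt', 'weights.caffemodel');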

“As deep learning becomes more prevalent across multiple industries, there is a need to make it broadly available, accessible, and applicable to engineers and scientists with varying specializations,” said David Rich, MATLAB marketing director, MathWorks. “Now, deep learning novices and experts can learn, apply, and conduct advanced research with MATLAB by using an integrated deep learning workflow from research to prototype to production.”

MathWorks continues to improve user productivity and ease of use for deep learning workflows in R2018b through:

  • The Deep Network Designer app, which enables users to create network architectures or modify pretrained networks for transfer learning

  • Improved network training performance beyond desktop capabilities through cloud support, including the MATLAB Deep Learning Container on NVIDIA GPU Cloud and MATLAB reference architectures for Amazon Web Services and Microsoft Azure

  • Broadened support for domain-specific workflows, including ground-truth labeling apps for audio and video as well as application-specific datastores, making it easier and faster to work with large collections of data (a brief sketch follows this list)
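
To make the workflow items above concrete, here is a minimal sketch, assuming Deep Learning Toolbox and Audio Toolbox are installed; the data folder is a hypothetical path:

    % Open the Deep Network Designer app to build a network or adapt a
    % pretrained one for transfer learning
    deepNetworkDesigner

    % Manage a large audio collection with an application-specific datastore,
    % deriving labels from the folder structure
    ads = audioDatastore('C:\data\speechCommands', ...
        'IncludeSubfolders', true, ...
        'LabelSource', 'foldernames');

    % Split the labeled data into training and validation sets
    [trainSet, valSet] = splitEachLabel(ads, 0.8);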

In R2018b, GPU Coder continues to improve inference performance by supporting NVIDIA libraries and adding optimizations such as auto-tuning, layer fusion, and buffer minimization. In addition, deployment support has been added for Intel and ARM platforms using Intel MKL-DNN and ARM Compute Library.
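
The following minimal sketch illustrates this kind of deployment flow with GPU Coder and MATLAB Coder; myPredict.m is a hypothetical entry-point function that loads a network with coder.loadDeepLearningNetwork and calls predict, and the input size is illustrative:

    % Generate CUDA code for an NVIDIA GPU target using the cuDNN library
    cfg = coder.gpuConfig('lib');
    cfg.DeepLearningConfig = coder.DeepLearningConfig('cudnn');
    codegen -config cfg myPredict -args {ones(224,224,3,'single')}

    % Generate C++ code for an Intel CPU target using MKL-DNN
    % (an 'arm-compute' target is configured analogously for ARM platforms)
    ccfg = coder.config('lib');
    ccfg.DeepLearningConfig = coder.DeepLearningConfig('mkldnn');
    codegen -config ccfg myPredict -args {ones(224,224,3,'single')}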

Available immediately, R2018b includes updates to the MATLAB and Simulink product families, with new capabilities for code generation, signal processing and communications, and verification and validation. For information on all new capabilities and bug fixes across the MATLAB and Simulink product families, watch the R2018b Highlights video.