This article describes TensorLayer, a modular Python wrapper library for TensorFlow that allows data scientists to streamline the development of complex deep learning systems. TensorLayer was released on GitHub in September 2016. A descriptive research paper followed in August 2017: TensorLayer: A Versatile Library for Efficient Deep Learning Development. TensorLayer won the ACM SIGMM 2017 Best Open Source Software award. Here are the slides from the acceptance presentation. Let’s take a quick look at what TensorLayer has to offer data scientists.
To many data scientists, native TensorFlow itself is too raw. A helper layer makes the experience much more streamlined.
“TensorLayer not only provides the high-level layer abstraction similar to other libraries like Keras, but also provides an end-to-end workflow including rich data pre-processing, training, post-processing, serving modules and database management,” said Hao Dong, PhD student at Imperial College London and one of the TensorLayer contributors. “All this helps developers to build a whole learning system from experimental phase to product deployment by using only TensorLayer.”
A Review of TensorFlow Sugar Coatings compared TensorLayer with Keras, as well as other popular TensorFlow wrappers such as TF-Slim, SugarTensor, TFLearn, and PrettyTensor. Some believe the layer abstraction of TensorLayer was inspired by Lasagne, which uses Theano as its back end, while adding advanced layers aimed at researchers.
The TensorLayer community has provided many good materials with which to learn TensorLayer: Examples, Tutorials, and also the Documentation. The GitHub repo also hosts a number of applications that benefit from TensorLayer, including GANs, DRL, hyperparameter optimization and cross-validation, image transformation, and medical signal processing.
The two main contributors of TensorLayer, Hao Dong and Luo Mai, wrote up their early motivations to create a new TensorFlow wrapper and explain its place in the deep learning ecosystem:
“TensorLayer occupies a unique spot within the deep learning ecosystem. Most notably, there is TensorLayer’s focus on research. Even after the library grew beyond Hao’s lab, it has kept a closer relationship with the academic community than other frameworks. The majority of users, according to Hao, are university researchers — and they don’t only use the library, many of them also contribute their own code to it.”
The idea behind TensorLayer is to provide a modular approach to deep learning in order to manage the complexity and iterative tasks that come with large artificial neural networks (ANNs) and their interactions. TensorLayer lifts the low-level dataflow abstraction of TensorFlow to high-level layers. The APIs in TensorLayer strive for transparency: there is no attempt to shield TensorFlow from users; instead, the library leaves accessible hooks that support low-level tuning. The layer abstraction adds little overhead, so TensorLayer can match the performance of native TensorFlow. The code base for TensorLayer is written in Python. TensorLayer relies on TensorFlow’s compute engine for training and uses MongoDB as the storage backend.
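To make the "lifted but transparent" design concrete, here is a minimal plain-Python sketch of the layer-stacking idiom the paragraph describes. This is illustrative only, not actual TensorLayer code: each layer object wraps a compute step but keeps the raw output reachable on an `outputs` attribute, which plays the role of the underlying TensorFlow tensor that TensorLayer leaves exposed as a low-level hook.

```python
class InputLayer:
    """Entry point of a stack; simply holds the raw input value."""
    def __init__(self, value, name="input"):
        self.name = name
        self.outputs = value  # low-level value stays directly accessible

class Layer:
    """One node in a stack: wraps a function over the previous layer's
    outputs, but never hides the raw result from the user."""
    def __init__(self, prev, fn, name):
        self.name = name
        self.outputs = fn(prev.outputs)

# Stack layers the way TensorLayer-style wrappers do:
net = InputLayer([1.0, -2.0, 3.0])
net = Layer(net, lambda xs: [x * 2 for x in xs], name="scale")
net = Layer(net, lambda xs: [max(x, 0.0) for x in xs], name="relu")

print(net.outputs)  # the transparency "hook": prints [2.0, 0.0, 6.0]
```

Because `outputs` is an ordinary attribute rather than a sealed abstraction, a user can reach past the wrapper at any point in the stack for fine-grained tuning, which is the transparency property the article attributes to TensorLayer's APIs.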
TensorLayer’s core follows a modular methodology, as depicted in the architecture diagram below. The key deep learning elements addressed include building deep neural nets, implementing layers, managing data sets, and structuring a data pipeline. The overarching feature of TensorLayer is its integrated approach: operations such as neural networks, their associated states, data, and other parameters are grouped into modules, each at its own level of abstraction.
- Layer module – provides reference implementations of many commonly used neural network layers such as CNN, RNN, dropout, batch normalization, etc. You’re able to stack layers to create deep neural networks which are then delegated to TensorFlow.
- Model module – helps manage the intermediate states incurred throughout a model life-cycle – trained, evaluated, deployed. States can be persisted, cached and reloaded.
- Data set module – manages training data and prediction results. TensorLayer works to optimize data set performance using various techniques.
- Workflow module – supports asynchronous scheduling and failure recovery for concurrent training jobs.
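The Model module's save/cache/reload life-cycle can be sketched in a few lines of plain Python. The class and method names below (`ModelStore`, `save`, `load`) are hypothetical stand-ins, not the real TensorLayer API, and a JSON file on disk stands in for the MongoDB storage backend the article mentions.

```python
import json
import os
import tempfile

class ModelStore:
    """Hypothetical sketch of a model-state store: persists named
    parameter snapshots so a trained model can be reloaded later."""
    def __init__(self, path):
        self.path = path

    def save(self, name, params):
        # Persist one named snapshot of model parameters to disk.
        with open(os.path.join(self.path, name + ".json"), "w") as f:
            json.dump(params, f)

    def load(self, name):
        # Reload a previously persisted snapshot by name.
        with open(os.path.join(self.path, name + ".json")) as f:
            return json.load(f)

store = ModelStore(tempfile.mkdtemp())
store.save("mnist_v1", {"dense1/W": [[0.1, 0.2]], "dense1/b": [0.0]})
params = store.load("mnist_v1")  # same parameters, round-tripped
```

The point of the sketch is the shape of the workflow, persist after training, reload for evaluation or deployment, rather than any particular storage format.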
TensorLayer represents yet another entry in the growing number of TensorFlow wrappers arriving on the scene to boost data scientist productivity. Although my personal experience has mostly been limited to Keras, I think TensorLayer deserves serious consideration, especially in light of its purposeful and continuing relationship with the academic community.