Deep Neural Networks

Deep Learning (DL) is a state-of-the-art approach in machine learning and data mining. Since 2006, DL has been successfully applied to a wide range of practical problems, including image understanding, speech recognition, word segmentation, and topic segmentation and recognition.

A DL model is a composition of consecutive layers of signal-processing units that transform the data into increasingly abstract representations with respect to a set of specified targets. Such a representation exposes the patterns and structures that reveal the most informative aspects of the input data with respect to the output targets, which in general is the main goal of any pattern recognition method and data mining model.
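As a rough illustration of this layered composition (a minimal sketch with made-up layer sizes, not the group's code), the example below stacks a few affine-plus-nonlinearity units so that each layer produces a more abstract representation of its input:

```python
import numpy as np

def layer(x, W, b):
    """One signal-processing unit: an affine map followed by a non-linearity."""
    return np.tanh(x @ W + b)

rng = np.random.default_rng(0)
x = rng.normal(size=(32, 100))            # a batch of 32 raw input vectors

# Three stacked layers; each output is a more abstract representation
# of the input than the previous one (all sizes are illustrative).
sizes = [100, 64, 32, 10]
params = [(0.1 * rng.normal(size=(m, n)), np.zeros(n))
          for m, n in zip(sizes[:-1], sizes[1:])]

h = x
for W, b in params:
    h = layer(h, W, b)                    # compose layer after layer

print(h.shape)                            # (32, 10): the final, most abstract representation
```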

The structure of DL models and their training algorithms (in most cases stochastic gradient descent, SGD) lets us exploit the power of graphics processing units (GPUs) and parallel processing in order to train models with minimal processing time and computational cost. This combination of cutting-edge hardware and algorithms makes it possible to tackle large practical problems in data mining and pattern analysis efficiently.
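A minimal sketch of such GPU-backed SGD training, assuming PyTorch and synthetic data (the model, sizes, and learning rate are illustrative choices, not the group's setup):

```python
import torch
from torch import nn

# Run on a GPU when one is available; otherwise fall back to the CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"

# A small DNN classifier and a stochastic gradient descent optimizer.
model = nn.Sequential(
    nn.Linear(100, 64), nn.ReLU(),
    nn.Linear(64, 32), nn.ReLU(),
    nn.Linear(32, 10),
).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Synthetic mini-batches stand in for a real training set.
for step in range(100):
    x = torch.randn(128, 100, device=device)
    y = torch.randint(0, 10, (128,), device=device)

    optimizer.zero_grad()
    loss = loss_fn(model(x), y)   # forward pass
    loss.backward()               # gradients via backpropagation
    optimizer.step()              # one SGD update
```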

DL projects in our group can be divided into two branches:

  • Practical implementation of DL: The most widely discussed use of Deep Neural Networks (DNNs) is in practical classification problems. Using powerful GPUs and fast algorithms, the group has worked on emotion detection and gender detection, and has also designed an implementation of auto-encoders.
Fig. 1: Visual representation of a deep neural network.

  • Neural Network Analysis: Understanding how neural networks work requires a close look at the flow of information inside the network. Analyzing the stochastic processes inside a neural network with information theory and the information bottleneck approach can guide the design of an optimal network for a given training dataset. The second project in our group is therefore to study the processes inside neural networks using information-theoretic tools; a small sketch of this kind of analysis is given below.
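As a rough illustration of this kind of analysis (a sketch with synthetic data, not the group's actual code), the example below bins the activations of a stand-in hidden layer and estimates the mutual information the layer retains about the inputs and the targets, in the spirit of binning-based information bottleneck analyses:

```python
import numpy as np

def mutual_information(a, b):
    """I(A; B) in bits for two discrete (integer-coded) variables."""
    joint = np.zeros((a.max() + 1, b.max() + 1))
    np.add.at(joint, (a, b), 1)
    joint /= joint.sum()
    pa, pb = joint.sum(axis=1), joint.sum(axis=0)
    nz = joint > 0
    return np.sum(joint[nz] * np.log2(joint[nz] / (pa[:, None] * pb[None, :])[nz]))

rng = np.random.default_rng(0)
x_ids = rng.integers(0, 50, size=2000)         # identity of each input pattern
y = x_ids % 2                                  # a made-up binary target
t = np.tanh(rng.normal(size=(50, 8)))[x_ids]   # stand-in hidden-layer activations

# Discretize the activations into a few bins and measure how much the
# layer retains about the input, I(X;T), and about the target, I(T;Y).
t_bins = np.digitize(t, np.linspace(-1.0, 1.0, 10))
_, t_ids = np.unique(t_bins, axis=0, return_inverse=True)
t_ids = t_ids.reshape(-1)

print("I(X;T) =", mutual_information(x_ids, t_ids))
print("I(T;Y) =", mutual_information(t_ids, y))
```

Tracking these two quantities layer by layer over the course of training is what the information bottleneck view uses to characterize how a network compresses its input while preserving information about the targets.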