The Effects of Regularization on Learning Facial Expressions with Convolutional Neural Networks

Proceedings of the 25th International Conference on Artificial Neural Networks (ICANN 2016), pages 80--87, doi: 10.1007/978-3-319-44781-0_10, Sep 2016
Convolutional neural networks (CNNs) have become effective instruments in facial expression recognition. Very good results can be achieved with deep CNNs that possess many layers and provide a good internal representation of the learned data. On the other hand, due to their potentially high complexity, CNNs are prone to overfitting; as a result, regularization techniques are needed to improve performance and minimize overfitting. However, it is not yet clear how these regularization techniques affect the learned representation of faces. In this paper we examine the effects of novel regularization techniques on the training and performance of CNNs and on their learned features. We train a CNN using dropout, max-pooling dropout, batch normalization, and different combinations of the three. We show that a combination of these methods can have a large impact on the performance of a CNN, almost halving its validation error. A visualization technique is applied to the CNNs to highlight their activations for different inputs, illustrating a significant difference between a standard CNN and a regularized CNN.
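To illustrate the difference between the two dropout variants studied in the paper, the following is a minimal NumPy sketch (not the paper's implementation; function names, the drop probability, and the 2x2 pooling window are illustrative assumptions). Standard dropout zeroes units after a layer's output, whereas max-pooling dropout zeroes units *before* the max is taken, so a non-maximal unit can win the pooling region:

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p, training=True):
    """Standard (inverted) dropout: zero each unit with probability p,
    rescale survivors by 1/(1-p) so expected activations are unchanged."""
    if not training or p == 0.0:
        return x
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

def max_pool_dropout(x, p, pool=2, training=True):
    """Max-pooling dropout: drop units before max pooling, so the max is
    taken only over the retained units of each pooling region."""
    if training and p > 0.0:
        x = x * (rng.random(x.shape) >= p)  # drop inputs to the pool
    h, w = x.shape
    # non-overlapping pool x pool regions, max over each region
    return x.reshape(h // pool, pool, w // pool, pool).max(axis=(1, 3))

fmap = rng.random((4, 4))                   # toy 4x4 feature map
print(max_pool_dropout(fmap, p=0.5).shape)  # -> (2, 2)
```

At test time both variants reduce to their unregularized counterparts (`training=False`), which is the usual convention for dropout-style methods.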


@InProceedings{HBW16,
  author    = {Hinz, Tobias and Barros, Pablo and Wermter, Stefan},
  title     = {The Effects of Regularization on Learning Facial Expressions with Convolutional Neural Networks},
  booktitle = {Proceedings of the 25th International Conference on Artificial Neural Networks (ICANN 2016)},
  pages     = {80--87},
  year      = {2016},
  month     = {Sep},
  publisher = {Springer},
  doi       = {10.1007/978-3-319-44781-0_10},
}