EnSyth: A Pruning Approach to Synthesis of Deep Learning Ensembles

Alhalabi, Besher and Gaber, Mohamed Medhat and Basurra, Shadi (2019) EnSyth: A Pruning Approach to Synthesis of Deep Learning Ensembles. In: IEEE International Conference on Systems, Man, and Cybernetics (IEEE SMC 2019), 06-09 October, 2019, Bari, Italy.


Abstract

Deep neural networks have achieved state-of-the-art performance in many domains, including computer vision, natural language processing and self-driving cars. However, they are computationally expensive and memory intensive, which raises significant challenges when deploying or training them in latency-critical applications or resource-limited environments. As a result, many approaches have been proposed to accelerate and compress deep learning models; however, most fail to maintain the accuracy of the baseline models. In this paper, we describe EnSyth, a deep learning ensemble approach to enhance the predictability of compact neural network models. First, we generate a set of diverse compressed deep learning models by running a pruning method with different hyperparameters; we then utilise ensemble learning to synthesise the outputs of the compressed models into a new pool of classifiers. Finally, we apply backward elimination to the generated pool to explore the best-performing combinations of models. On the CIFAR-10 and CIFAR-5 datasets with LeNet-5, EnSyth outperforms the predictability of the baseline model.
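The final step of the pipeline described above — selecting the best-performing subset of compressed models from the pool — can be sketched as backward elimination over classifier predictions. The code below is an illustrative sketch, not the authors' implementation: it assumes the pool members' class predictions on a held-out validation set are available as integer label arrays, combines them by simple majority vote, and greedily drops any model whose removal does not hurt ensemble accuracy.

```python
# Illustrative sketch (assumed details, not the paper's code): backward
# elimination over a pool of classifiers combined by majority vote.
import numpy as np

def majority_vote(preds):
    # preds: (n_models, n_samples) array of integer class labels.
    preds = np.asarray(preds)
    n_classes = preds.max() + 1
    # Count votes per class for each sample -> shape (n_classes, n_samples).
    counts = np.apply_along_axis(np.bincount, 0, preds, minlength=n_classes)
    return counts.argmax(axis=0)

def backward_elimination(preds, y_true):
    # Start from the full pool and greedily remove models whose removal
    # does not lower validation accuracy of the majority-vote ensemble.
    pool = list(range(len(preds)))
    best_acc = np.mean(majority_vote([preds[i] for i in pool]) == y_true)
    improved = True
    while improved and len(pool) > 1:
        improved = False
        for i in list(pool):
            trial = [j for j in pool if j != i]
            acc = np.mean(majority_vote([preds[j] for j in trial]) == y_true)
            if acc >= best_acc:  # model i does not help: drop it
                best_acc, pool, improved = acc, trial, True
                break
    return pool, best_acc
```

In practice one would generate `preds` by evaluating each pruned model (e.g. LeNet-5 pruned with different hyperparameters) on a validation split, then retain only the selected subset for deployment.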

Item Type: Conference or Workshop Item (Paper)
Date: 22 July 2019
Subjects: G700 Artificial Intelligence
Divisions: Faculty of Computing, Engineering and the Built Environment > School of Computing and Digital Technology > Enterprise Systems
Depositing User: Mohamed Gaber
Date Deposited: 28 Jul 2019 14:46
Last Modified: 03 Jul 2020 12:25
URI: http://www.open-access.bcu.ac.uk/id/eprint/7781
