gbt-HIPS: Explaining the Classifications of Gradient Boosted Tree Ensembles

Hatwell, Julian and Gaber, Mohamed Medhat and Azad, R. Muhammad Atif (2021) gbt-HIPS: Explaining the Classifications of Gradient Boosted Tree Ensembles. Applied Sciences, 11 (6). p. 2511. ISSN 2076-3417

applsci-11-02511.pdf - Published Version
Available under License Creative Commons Attribution.


This research presents Gradient Boosted Tree High Importance Path Snippets (gbt-HIPS), a novel, heuristic method for explaining gradient boosted tree (GBT) classification models by extracting a single classification rule (CR) from the ensemble of decision trees that make up the GBT model. This CR contains the most statistically important boundary values of the input space as antecedent terms. The CR represents a hyper-rectangle of the input space inside which the GBT model is, very reliably, classifying all instances with the same class label as the explanandum instance. In a benchmark test using nine data sets and five competing state-of-the-art methods, gbt-HIPS offered the best trade-off between coverage (0.16–0.75) and precision (0.85–0.98). Unlike competing methods, gbt-HIPS is also demonstrably guarded against under- and over-fitting. A further distinguishing feature of our method is that, unlike much prior work, our explanations also provide counterfactual detail in accordance with widely accepted recommendations for what makes a good explanation.
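The abstract describes the extracted classification rule (CR) as a hyper-rectangle of the input space, evaluated by its coverage and its precision (agreement with the black-box model inside the rule). The sketch below is an illustrative toy model of that idea only, not the authors' gbt-HIPS implementation; the data structure, function names, and the stand-in "model" are all assumptions:

```python
import numpy as np

# Illustrative sketch, not the gbt-HIPS algorithm itself. A classification
# rule (CR) is modelled as a hyper-rectangle: per-feature (lower, upper)
# bounds acting as antecedent terms, plus the predicted class label.

def rule_covers(rule, X):
    """Boolean mask of instances falling inside the rule's hyper-rectangle."""
    mask = np.ones(len(X), dtype=bool)
    for feature, (lo, hi) in rule["antecedents"].items():
        mask &= (X[:, feature] > lo) & (X[:, feature] <= hi)
    return mask

def coverage_and_precision(rule, X, model_preds):
    """Coverage: fraction of all instances the rule covers.
    Precision: fraction of covered instances whose black-box
    prediction matches the rule's class label."""
    mask = rule_covers(rule, X)
    coverage = mask.mean()
    precision = (model_preds[mask] == rule["label"]).mean() if mask.any() else 0.0
    return coverage, precision

# Toy example: 2-D data; the "model" is a simple threshold standing in
# for a gradient boosted tree ensemble's predictions (an assumption).
rng = np.random.default_rng(0)
X = rng.uniform(size=(1000, 2))
model_preds = (X[:, 0] > 0.25).astype(int)
rule = {"antecedents": {0: (0.2, 0.8), 1: (0.0, 0.5)}, "label": 1}
cov, prec = coverage_and_precision(rule, X, model_preds)
```

A narrow hyper-rectangle tends to raise precision while lowering coverage; the paper's benchmark figures (coverage 0.16–0.75 at precision 0.85–0.98) describe how gbt-HIPS trades these off.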

Item Type: Article
Identification Number:
Accepted: 6 March 2021
Published: 11 March 2021
Uncontrolled Keywords: explainable artificial intelligence; human-understandable AI systems; gradient boosting; black box problem; machine learning interpretability
Subjects: CAH11 - computing > CAH11-01 - computing > CAH11-01-01 - computer science
CAH11 - computing > CAH11-01 - computing > CAH11-01-05 - artificial intelligence
Divisions: Faculty of Computing, Engineering and the Built Environment > School of Computing and Digital Technology
Depositing User: Atif Azad
Date Deposited: 12 Mar 2021 11:03
Last Modified: 12 Jan 2022 12:58
