Induction of Non-monotonic Logic Programs To Explain Statistical Learning Models

Farhad Shakerin
(The University of Texas at Dallas)

We present a fast and scalable algorithm to induce non-monotonic logic programs from statistical learning models. We reduce the problem of searching for the best clauses to instances of the High-Utility Itemset Mining (HUIM) problem: the feature values of each training example form a transaction, and their importance scores serve as the utilities. We use TreeExplainer, a fast and scalable implementation of the explainable-AI tool SHAP, to extract locally important features and their weights from ensemble tree models. Our experiments on standard UCI benchmarks suggest significant improvements in classification evaluation metrics and in the running time of the training algorithm compared to ALEPH, a state-of-the-art Inductive Logic Programming (ILP) system.
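
As a minimal sketch (not the authors' implementation), the SHAP-to-transaction encoding step described above might look as follows in Python, assuming a scikit-learn gradient-boosted ensemble and the shap package; the dataset choice and the to_transaction helper are illustrative, and the subsequent high-utility itemset mining and clause induction steps are omitted.

import shap
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier

# A UCI benchmark (Wisconsin breast cancer), chosen here for illustration.
X, y = load_breast_cancer(return_X_y=True, as_frame=True)
model = GradientBoostingClassifier().fit(X, y)

# TreeExplainer is SHAP's fast, tree-model-specific explainer.
explainer = shap.TreeExplainer(model)
# For a binary gradient-boosted classifier this returns one local
# importance weight per feature per example: shape (n_examples, n_features).
shap_values = explainer.shap_values(X)

def to_transaction(row, weights):
    # Hypothetical helper: items are feature=value pairs, and utilities
    # are absolute SHAP weights, so locally important features
    # contribute more utility to the transaction.
    return [(f"{feat}={row[feat]}", abs(w)) for feat, w in zip(row.index, weights)]

# One HUIM transaction per training example.
transactions = [to_transaction(X.iloc[i], shap_values[i]) for i in range(len(X))]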

In Bart Bogaerts, Esra Erdem, Paul Fodor, Andrea Formisano, Giovambattista Ianni, Daniela Inclezan, German Vidal, Alicia Villanueva, Marina De Vos and Fangkai Yang: Proceedings 35th International Conference on Logic Programming (Technical Communications) (ICLP 2019), Las Cruces, NM, USA, September 20-25, 2019, Electronic Proceedings in Theoretical Computer Science 306, pp. 379–388.
Published: 19th September 2019.

DOI: https://dx.doi.org/10.4204/EPTCS.306.51