

Talks and Poster Presentations (with Proceedings-Entry):

S. Ordyniak, S. Szeider:
"Parameterized Complexity of Small Decision Tree Learning";
Talk: 35th AAAI 2021, virtual event; 2021-02-02 - 2021-02-09; in: "Thirty-Fifth AAAI Conference on Artificial Intelligence", AAAI Press, 35 (2021), ISBN: 978-1-57735-866-4; 1 - 9.



English abstract:
We study the NP-hard problem of learning a decision tree
(DT) of smallest depth or size from data. We provide the first
parameterized complexity analysis of the problem and draw
a detailed parameterized complexity map for the natural parameters:
size or depth of the DT, maximum domain size of
all features, and the maximum Hamming distance between
any two examples. Our main result shows that learning DTs
of smallest depth or size is fixed-parameter tractable (FPT)
parameterized by the combination of all three of these parameters.
We contrast this FPT result with various hardness results
that underline the algorithmic significance of the considered
parameters.
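To make the problem concrete, here is a minimal brute-force sketch (not the paper's FPT algorithm) that computes the smallest depth of a decision tree classifying a given set of examples with zero error; the function name and data representation are illustrative assumptions:

```python
def min_dt_depth(examples):
    """Smallest depth of a decision tree classifying `examples`
    (each a (feature-tuple, label) pair) with zero error.
    Exponential-time brute force, intended only for tiny instances;
    the paper's FPT algorithm is far more sophisticated."""
    labels = {label for _, label in examples}
    if len(labels) <= 1:
        return 0  # leaf node: all examples share one label
    n_features = len(examples[0][0])
    best = float("inf")
    for f in range(n_features):
        # group examples by their value on feature f
        branches = {}
        for x, label in examples:
            branches.setdefault(x[f], []).append((x, label))
        if len(branches) < 2:
            continue  # feature f does not split this example set
        depth = 1 + max(min_dt_depth(b) for b in branches.values())
        best = min(best, depth)
    return best  # inf if examples are contradictory


# XOR of two binary features needs depth 2; a single decisive
# feature suffices for depth 1.
xor = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
print(min_dt_depth(xor))
```

Each branch of a useful split contains strictly fewer examples, so the recursion terminates; the search over all features at every node is what makes the problem hard in general.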


Electronic version of the publication:
https://publik.tuwien.ac.at/files/publik_300053.pdf


Created from the Publication Database of the Vienna University of Technology.