

Contributions to Proceedings:

M. Thiessen, T. Gärtner:
"Active Learning of Convex Halfspaces on Graphs";
in: "Advances in Neural Information Processing Systems 34", Advances in Neural Information Processing Systems 34, 2021.



English abstract:
We systematically study the query complexity of learning geodesically convex halfspaces on graphs. Geodesic convexity is a natural generalisation of Euclidean convexity and allows the definition of convex sets and halfspaces on graphs. We prove an upper bound on the query complexity linear in the treewidth and the minimum hull set size but only logarithmic in the diameter. We show tight lower bounds along well-established separation axioms and identify the Radon number as a central parameter of the query complexity and the VC dimension. While previous bounds typically depend on the cut size of the labelling, all parameters in our bounds can be computed from the unlabelled graph. We provide evidence that ground-truth communities in real-world graphs are often convex and empirically compare our proposed approach with other active learning algorithms.
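
Illustrative sketch (not the authors' code): the abstract's central notion, geodesic convexity, says a vertex set is convex if it contains every vertex on a shortest path between any two of its members, and a halfspace is a convex set whose complement is also convex. The following Python sketch, assuming networkx and unweighted graphs, computes the geodesic convex hull by iterating the shortest-path interval operator and checks the halfspace property.

```python
# Minimal sketch of geodesic convexity on graphs (illustration only, assumes networkx).
import itertools
import networkx as nx


def interval(G, S):
    """All vertices lying on some shortest path between a pair of vertices in S."""
    dist = dict(nx.all_pairs_shortest_path_length(G))
    closure = set(S)
    for u, v in itertools.combinations(S, 2):
        d_uv = dist[u][v]
        closure.update(w for w in G if dist[u][w] + dist[w][v] == d_uv)
    return closure


def geodesic_convex_hull(G, S):
    """Smallest geodesically convex superset of S: iterate the interval operator to a fixed point."""
    hull = set(S)
    while True:
        new_hull = interval(G, hull)
        if new_hull == hull:
            return hull
        hull = new_hull


def is_halfspace(G, S):
    """S is a halfspace if both S and its complement are geodesically convex."""
    S = set(S)
    comp = set(G) - S
    return geodesic_convex_hull(G, S) == S and geodesic_convex_hull(G, comp) == comp


# Example: on a 4-cycle, the hull of two opposite vertices is the whole cycle,
# while two adjacent vertices already form a convex set (and a halfspace).
G = nx.cycle_graph(4)
print(geodesic_convex_hull(G, {0, 2}))  # {0, 1, 2, 3}
print(is_halfspace(G, {0, 1}))          # True
```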

Keywords:
active learning, learning theory, semi-supervised learning, transduction, vertex classification, graphs, convexity theory, geodesic convexity, shortest paths, halfspaces, query complexity


Electronic version of the publication:
https://proceedings.neurips.cc/paper/2021/hash/c4bf1e24f3e6f92ca9dfd9a7a1a1049c-Abstract.html

