Departmental Colloquium

Artem Chernikov
University of Maryland
Higher arity Vapnik–Chervonenkis theory and PAC learning
Abstract: Finite VC-dimension, a combinatorial property of families of sets, was discovered independently in the 1970s by Vapnik and Chervonenkis in probabilistic learning theory and by Shelah in model theory (where it is called NIP). It plays an important role in several areas, including machine learning, combinatorics, mathematical logic, functional analysis and topological dynamics. A higher-arity generalization of VC-dimension for families of sets in n-fold product spaces (i.e. a bound on the sizes of n-dimensional boxes that can be shattered) is implicit in Shelah's work on n-dependent theories in model theory. Following preliminary work in Chernikov, Palacin, Takeuchi '14, in Chernikov, Towsner '20 we developed aspects of higher-arity VC-theory, including a generalization of Haussler's packing lemma to families of sets (and real-valued functions) of bounded VC_n-dimension. Probably Approximately Correct (PAC) learning is a classical framework for the mathematical analysis of machine learning, and PAC learnability is famously characterized by finite VC-dimension. Generalizing this, we demonstrate that finite VC_n-dimension characterizes higher-arity PAC learning (PAC_n learning) in n-fold product spaces with respect to the product measures introduced by Kobayashi, Kuriyama and Takeuchi '15. Joint work with Henry Towsner.
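To make the box-shattering condition behind VC_n-dimension concrete, here is a minimal Python sketch for the n = 2 case. The function names (box_shattered, powerset) and the demo family of axis-aligned rectangles are illustrative choices of ours, not taken from the talk or the papers cited above; the sketch only checks, by brute force, whether a given d x d box is shattered by a finite family.

```python
from itertools import combinations, product

def powerset(points):
    """All subsets of a finite collection of points."""
    pts = list(points)
    return [set(c) for r in range(len(pts) + 1) for c in combinations(pts, r)]

def box_shattered(family, rows, cols):
    """Check whether the 2-dimensional box rows x cols is shattered by
    `family`: every subset of the box must arise as the trace F & box
    of some set F in the family."""
    box = set(product(rows, cols))
    traces = {frozenset(f & box) for f in family}
    return all(frozenset(s) in traces for s in powerset(box))

# Demo: axis-aligned combinatorial rectangles A x B inside a 2 x 2 grid.
# The trace of a rectangle on any box is again a rectangle, so diagonal
# subsets are never realized: the family shatters every 1 x 1 box but
# no 2 x 2 box, i.e. it has VC_2-dimension 1 in the sense sketched above.
subsets = powerset(range(2))
rectangles = [set(product(a, b)) for a in subsets for b in subsets]
print(box_shattered(rectangles, [0], [0]))        # True
print(box_shattered(rectangles, [0, 1], [0, 1]))  # False
```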
Friday, October 17, 2025, at 3:00 PM in 636 SEO