Learning Multiple Tasks over Multiple Paths
Time: Fri 2022-11-11 11.00 - 12.00
Location: Malvinas väg 10, floor 7, Harry Nyquist
Video link: https://kth-se.zoom.us/j/67493021792
Language: English
Participating: Samet Oymak, Electrical and Computer Engineering, University of California, Riverside
Abstract: Conventional multitask learning (MTL) methods build a single representation shared across tasks. A desirable refinement of this one-size-fits-all representation is to construct task-specific representations. To this end, recent PathNet/muNet architectures represent individual tasks as pathways within a larger supernet. The subnetworks induced by pathways can be viewed as task-specific representations that are compositions of modules within the supernet's computation graph. This work explores the pathways proposal through the lens of statistical learning: we first develop generalization bounds for empirical risk minimization problems that learn multiple tasks over multiple paths (Multipath MTL). In conjunction, we formalize the benefits of the resulting multipath representation when adapting to new downstream tasks. Our bounds are expressed in terms of Gaussian complexity, capture the modularity of the supernet, and lead to tangible guarantees for parametric representations. They reveal theoretical insights into the benefits of a multipath representation (e.g., when Multipath MTL is superior to traditional MTL) and into how over-parameterized supernets help ensure representational fairness across tasks.
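For readers unfamiliar with the pathway idea, here is a minimal, self-contained Python sketch of the concept described in the abstract: the supernet is modeled as a grid of simple modules, each task's pathway selects one module per layer, and the induced subnetwork is the composition of the selected modules. All names and shapes here (the grid layout, the linear-plus-ReLU modules) are illustrative assumptions for exposition, not the speaker's actual architecture or code.

```python
import numpy as np

rng = np.random.default_rng(0)

DEPTH, WIDTH, DIM = 3, 4, 8  # layers, candidate modules per layer, feature dim

# Shared supernet: a DEPTH x WIDTH grid of simple linear modules.
supernet = [[rng.standard_normal((DIM, DIM)) / np.sqrt(DIM)
             for _ in range(WIDTH)] for _ in range(DEPTH)]

def forward(x, pathway):
    """Apply the subnetwork induced by `pathway`: one module index per
    layer, composed in sequence (a task-specific representation)."""
    h = x
    for layer, module_idx in enumerate(pathway):
        h = np.maximum(supernet[layer][module_idx] @ h, 0.0)  # linear + ReLU
    return h

# Two tasks sharing the layer-0 module but taking different paths afterwards;
# related tasks can reuse modules while still getting distinct representations.
pathway_task_a = [0, 1, 2]
pathway_task_b = [0, 3, 1]

x = rng.standard_normal(DIM)
print(forward(x, pathway_task_a))
print(forward(x, pathway_task_b))
```

In this toy picture, traditional shared-representation MTL corresponds to all tasks using the same pathway, while Multipath MTL lets each task pick its own route through the module grid; the talk's generalization bounds quantify the statistical cost and benefit of that extra flexibility.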
Bio: Samet Oymak is an assistant professor of Electrical and Computer Engineering at the University of California, Riverside. Prior to UCR, he spent three years at Google and in algorithmic finance, and during his postdoc he was a Simons Fellow and AMPLab member at UC Berkeley. He obtained his PhD from Caltech in 2015, receiving the Charles Wilts Prize for the best departmental thesis. At UCR, he has received an NSF CAREER award as well as a Research Scholar award from Google. Website: intra.ece.ucr.edu/~oymak