VASC Seminar

Miguel Á. Carreira-Perpiñán, Associate Professor, UC Merced
Monday, September 16
3:00 pm to 4:00 pm
Learning nested systems using auxiliary coordinates

Event Location: NSH 1507
Bio: Miguel Á. Carreira-Perpiñán is an associate professor in Electrical Engineering and Computer Science at the University of California, Merced. He received the degree of “licenciado en informática” (MSc in computer science) from the Technical University of Madrid in 1995 and a PhD in computer science from the University of Sheffield in 2001. Prior to joining UC Merced, he did postdoctoral work at Georgetown University (in computational neuroscience) and the University of Toronto (in machine learning), and was an assistant professor at the Oregon Graduate Institute (Oregon Health & Science University). He is the recipient of an NSF CAREER award, a Google Faculty Research Award and a best student paper award at Interspeech. He is an associate editor for the IEEE Transactions on Pattern Analysis and Machine Intelligence and an area chair for NIPS. His research interests lie in machine learning, in particular unsupervised learning problems such as dimensionality reduction, clustering and denoising, with an emphasis on optimization aspects, and with applications to speech processing (e.g. articulatory inversion and model adaptation), computer vision, sensor networks and other areas.

Abstract: Intelligent processing of complex signals such as images, sound or language is often performed by a parameterized hierarchy of nonlinear processing layers, sometimes biologically inspired. Hierarchical (or nested) systems offer a way to generate complex mappings using simple stages. Each layer performs a different operation and achieves an ever more sophisticated representation of the input, as, for example, in a deep artificial neural network, an object recognition cascade in computer vision or a speech front-end processing pipeline. Jointly estimating the parameters of all the layers and selecting an optimal architecture is widely considered a difficult nonconvex optimization problem, hard to parallelize in a distributed computation environment, and requiring significant human expert effort, which leads to suboptimal systems in practice.

We describe a general mathematical strategy to learn the parameters and, to some extent, the architecture of nested systems, called the method of auxiliary coordinates (MAC). MAC replaces the original problem, which involves a deeply nested function, with a constrained problem involving a different function in an augmented space without nesting. The constrained problem may be solved with penalty-based methods using alternating optimization over the parameters of each layer and over the auxiliary coordinates. MAC has provable convergence, is easy to implement by reusing existing algorithms for single layers, can be parallelized trivially and massively, applies even when parameter derivatives are unavailable or undesirable (so that chain-rule gradients do not apply), and is competitive with state-of-the-art nonlinear optimizers even in the serial computation setting, often yielding reasonable models within a few iterations.
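
To make the reformulation concrete, here is one way to write it down for a least-squares loss and a model with K nested layers (the notation and the choice of loss are illustrative assumptions on my part, not necessarily the speaker's). The nested problem

    \min_{\mathbf{W}} \; \frac{1}{2} \sum_{n=1}^{N} \left\| \mathbf{y}_n - \mathbf{f}_{K+1}\big( \cdots \mathbf{f}_1(\mathbf{x}_n; \mathbf{W}_1) \cdots ; \mathbf{W}_{K+1} \big) \right\|^2

becomes, after introducing one auxiliary coordinate vector \mathbf{z}_{n,k} per data point n and layer k,

    \min_{\mathbf{W}, \mathbf{Z}} \; \frac{1}{2} \sum_{n=1}^{N} \left\| \mathbf{y}_n - \mathbf{f}_{K+1}(\mathbf{z}_{n,K}; \mathbf{W}_{K+1}) \right\|^2 \quad \text{s.t.} \quad \mathbf{z}_{n,k} = \mathbf{f}_k(\mathbf{z}_{n,k-1}; \mathbf{W}_k), \qquad \mathbf{z}_{n,0} = \mathbf{x}_n,

whose quadratic-penalty relaxation, optimized while driving \mu \to \infty, is

    \min_{\mathbf{W}, \mathbf{Z}} \; \frac{1}{2} \sum_{n} \left\| \mathbf{y}_n - \mathbf{f}_{K+1}(\mathbf{z}_{n,K}; \mathbf{W}_{K+1}) \right\|^2 + \frac{\mu}{2} \sum_{n,k} \left\| \mathbf{z}_{n,k} - \mathbf{f}_k(\mathbf{z}_{n,k-1}; \mathbf{W}_k) \right\|^2 .

With \mathbf{Z} fixed, the penalty objective separates into K+1 independent single-layer fits (one per \mathbf{W}_k); with \mathbf{W} fixed, it separates into N independent small problems (one per data point), which is the source of the trivial, massive parallelism mentioned above.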

If time permits, I will illustrate how to use MAC to derive training algorithms for a range of problems, including deep nets, best-subset feature selection, joint dictionary and classifier learning and supervised dimensionality reduction.
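
For a feel of how such an algorithm looks in practice, below is a minimal NumPy sketch of a quadratic-penalty MAC loop for a toy single-hidden-layer least-squares net g(x) = W2 tanh(W1 x). Everything here, the toy data, the penalty schedule, the gradient step size and the ridge helper, is my own illustrative choice, not the speaker's published algorithm.

import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(3x) + noise, stored column-wise (one point per column).
N, Dx, H, Dy = 200, 1, 16, 1
X = rng.uniform(-1.0, 1.0, (Dx, N))
Y = np.sin(3.0 * X) + 0.05 * rng.standard_normal((Dy, N))

W1 = 0.5 * rng.standard_normal((H, Dx))   # first (hidden) layer
W2 = 0.5 * rng.standard_normal((Dy, H))   # second (output) layer
Z = W1 @ X   # auxiliary coordinates: one hidden pre-activation vector per point

def ridge(A, B, lam=1e-6):
    """Closed-form least-squares fit of W in W @ A ~= B (a single-layer 'W-step')."""
    return B @ A.T @ np.linalg.inv(A @ A.T + lam * np.eye(A.shape[0]))

mu = 1.0
for outer in range(20):              # penalty continuation: drive mu upward
    for it in range(50):             # alternating optimization at fixed mu
        # W-step: with Z fixed, each layer is an independent linear fit.
        W2 = ridge(np.tanh(Z), Y)    # fit tanh(Z) -> Y
        W1 = ridge(X, Z)             # fit X -> Z
        # Z-step: with W fixed, minimize over the auxiliary coordinates;
        # this decouples over the N points. A few gradient steps suffice here
        # (the step size is a crude illustrative choice).
        for _ in range(5):
            R = W2 @ np.tanh(Z) - Y                               # output residual
            G = (W2.T @ R) * (1.0 - np.tanh(Z) ** 2) + mu * (Z - W1 @ X)
            Z -= 0.1 / (1.0 + mu) * G
    mu *= 1.5

print("final nested-model MSE:", np.mean((W2 @ np.tanh(W1 @ X) - Y) ** 2))

Note how the W-step never differentiates through the nesting: each layer is fit with a standard single-layer solver, which is exactly the reuse-of-existing-algorithms property the abstract emphasizes.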