From Signal Processing to Machine Learning
New Activation Patterns, 2000-2006

Outline
1. A Message from #2
2. A History Lesson
3. A Birthday Present

Greetings from Yi Wan
• “I treasure many fond memories while being your student. I still remember the lovely way you smile.”
• “One day a visiting faculty from Tsinghua University (considered the best engineering school in China) came to my office. After he learned that I was your student, he highly praised you and said that I must be somebody because of your fame.”

Backdrop
• 1995: PhD, UW
• 1995-96: Postdoc, Rice
• 1996-99: Asst. Prof. at MSU
• Focus on “classical” signal and image processing (filtering, wavelets, multiscale methods)
• 1999: Hired by Rice as an Assistant Professor

New Millennium, New Horizons (and New Looks)
• Network Science
• Inverse Problems
• Machine Learning

Computer Vision?
• TEMPLAR: TEMPlate Learning from Atomic Representations

Dyadic, Coarse-to-Fine Thinking
• Function approximation
• Set boundary approximation

Dyadic, Coarse-to-Fine Thinking
• Classification (Clay)
• Density estimation (Becca)
• Density level sets (Aarti, Clay)
• Regression level sets (Becca)
• Active learning (Rui, Becca)
• Semi-supervised learning (Aarti)

Influences
• David Donoho (wedgelets, CART/best basis)
• Andrew Barron (complexity regularization, sieves)
• Polonik/Tsybakov (rate conditions)

Contributions
• Emphasis on approximation error
• Optimal rates of convergence
• Adaptivity

Lessons Learned
• Analysis of estimation error (CORT)
• Distributional assumptions
• Problems of interest
• Limitations of dyadic thinking

Q&A
• More influences, contributions, or lessons learned?
• What are the most important legacies of this phase of Rob’s career? (for the field, for us as individuals)
• How has your background in SP helped you address ML problems? (or vice versa)
• What are the keys to successfully entering a new field?

Birthday Present
• Linear Preference Model
• Suboptimality of Preference Learning
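
The final slide only names the linear preference model, so here is a minimal illustrative sketch, not taken from the slides: it assumes items are feature vectors, rewards are linear, r(x) = w·x, and pairwise preferences follow a Bradley-Terry model, P(a preferred to b) = sigmoid(w·(x_a − x_b)). All function names and the toy data below are hypothetical.

```python
import numpy as np

def preference_prob(w, x_a, x_b):
    """Bradley-Terry probability that item a is preferred to item b
    under an assumed linear reward r(x) = w @ x (illustrative only)."""
    return 1.0 / (1.0 + np.exp(-(w @ (x_a - x_b))))

def fit_linear_preferences(pairs, labels, dim, lr=0.1, steps=500):
    """Fit w by gradient ascent on the pairwise log-likelihood.
    pairs: list of (x_a, x_b) feature-vector tuples
    labels: 1 if a was preferred, 0 if b was preferred."""
    w = np.zeros(dim)
    for _ in range(steps):
        grad = np.zeros(dim)
        for (x_a, x_b), y in zip(pairs, labels):
            p = preference_prob(w, x_a, x_b)
            grad += (y - p) * (x_a - x_b)  # logistic-regression-style gradient
        w += lr * grad / len(pairs)
    return w

# Toy usage: preferences generated by a hidden linear reward.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0, 0.5])
X = rng.normal(size=(200, 3))
pairs = [(X[i], X[j]) for i, j in rng.integers(0, 200, size=(100, 2))]
labels = [rng.random() < preference_prob(w_true, a, b) for a, b in pairs]
w_hat = fit_linear_preferences(pairs, labels, dim=3)
print("recovered direction:", w_hat / np.linalg.norm(w_hat))
```

The sketch recovers only the direction of w, since Bradley-Terry probabilities are invariant to feature components that never differ across compared pairs; it makes no claim about the optimality results referenced on the slide.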