Efficient Alternatives and Extensions to Deep-Learning-Based Solutions Slides and Transcript
Naveen Verma
SSCS
IEEE Members: $10.00
Non-members: $20.00
Pages/Slides: 82
Abstract - Deep-learning systems have had profound impacts across a broad range of applications. However, it is important to remember that they represent only one class of
machine learning. In this segment of the short course, we start by probing what the critical attributes of deep learning are, and what challenges in modeling and
inference they solve. We then go on to consider the limitations of deep learning in emerging applications involving on-line learning (e.g., reinforcement learning
with embedded sensors) – namely, the need for a large number of training instances and the need for very low energy. This motivates alternatives or extensions to
deep learning, which make use of other forms of learning to enhance training and energy efficiency. Given the need for very low energy in many applications, we
explore how statistical learning can enable new hardware architectures, substantially overcoming the tradeoffs limiting conventional architectures for sensing
and computation. Finally, having examined how algorithmic techniques can enhance systems, we look at how systems, and emerging technologies for sensing,
can enhance algorithms. As an illustration, we consider how object-associated sensing, as enabled by IoT devices, has the potential to provide semantic structure,
leading to features that enhance the generalization of learning with simpler and easier-to-train models.
Bio - Naveen Verma received the B.A.Sc. degree in Electrical and Computer Engineering from the University of British Columbia, Vancouver, Canada, in 2003, and the
M.S. and Ph.D. degrees in Electrical Engineering from the Massachusetts Institute of Technology in 2005 and 2009, respectively. Since July 2009 he has been with
the Department of Electrical Engineering at Princeton University, where he is currently an Associate Professor. His research focuses on advanced sensing systems,
including low-voltage digital logic and SRAMs, low-noise analog instrumentation and data conversion, large-area sensing systems based on flexible electronics,
and low-energy algorithms for embedded inference, especially for medical applications. Prof. Verma is a Distinguished Lecturer of the IEEE Solid-State Circuits
Society, and serves on the technical program committees for ISSCC, VLSI Symp., DATE, and the IEEE Signal Processing Society (DISPS). Prof. Verma is a recipient
or co-recipient of the 2006 DAC/ISSCC Student Design Contest Award, the 2008 ISSCC Jack Kilby Paper Award, the 2012 Alfred Rheinstein Junior Faculty Award,
the 2013 NSF CAREER Award, the 2013 Intel Early Career Award, the 2013 Walter C. Johnson Prize for Teaching Excellence, the 2013 VLSI Symp. Best Student
Paper Award, the 2014 AFOSR Young Investigator Award, the 2015 Princeton Engineering Council Excellence in Teaching Award, and the 2015 IEEE Trans. CPMT
Best Paper Award.
Primary Committee:
SSCS