Computer Science
University of Toronto
MARCH 19: Theory CS Seminar
Event date: Friday, March 19, 2010, at 11:10 AM
Location: Galbraith Bldg, Rm. 244
Speaker: Andrew Wan, Columbia University
Title: Learning and the average case
Abstract: We study several well-known, open problems in computational learning theory from an average-case perspective, in which the success of the learning algorithm is measured according to fixed distributions over the examples it sees and over the functions that label them. With this perspective we obtain new positive and negative results.
In the positive direction, we address the open and notoriously difficult problem of learning DNF formulas. We establish a variety of new structural properties of "most" DNF formulas: more specifically, we show that the Fourier spectra of random monotone and non-monotone DNF formulas have a number of features that make learning possible in several models. Our techniques extend to show that the spectra of other subclasses of DNF formulas, such as read-k DNF formulas, have these features as well.
In the other direction, the average-case perspective lets us apply existing techniques from complexity theory to answer a fundamental question about the learnability of monotone functions. Over the years, a range of positive algorithmic results have been obtained for learning various classes of monotone Boolean functions. To date, the only negative result for learning monotone functions is an information-theoretic lower bound showing hardness for certain superpolynomial-size circuits. We establish cryptographic hardness of learning polynomial-size monotone circuits by applying a complexity-theoretic approach to hardness-amplification pioneered by O'Donnell (JCSS 04).