Nando de Freitas writes:
We’re designing two machine learning (ML) courses at Oxford (introductory and advanced ML).
In doing this, we have many questions and wonder what your thoughts are on the following:
- Which do you think are the key optimization papers/ideas that should be covered?
- Which topics do you think are the coolest in ML?
- Which are the essential ideas, tools and approaches?
- Are there other courses you would recommend?
- Which are good resources for students to learn to code and apply convolutional nets? Theano? What are the key deep learning things to know first?
- Which are the best scalable classifiers? … Pegasos … etc.?
- Which are the coolest applications that can be easily given as a programming exercise?
- What theory to teach? PAC? PAC-Bayes? CLTs?
- What are the best tutorials on sample complexity for ML?
- How much should we emphasize the trade-offs among computation, optimization, approximation, and estimation?
- What are the ML algorithms mostly used in industry?
- What are the essential ML papers that every student should read?
- How much should we teach hashing, Bloom filters, and other sketches?
- What are the most introductory papers on learning causal models from data (or from interventions)?
- How much nonparametrics à la DPs, CRPs, IBPs, etc.? When will these methods impact industry? What are the challenges?
- What are the best bootstrapping papers for the ML audience?
- What are the hottest MCMC samplers? I have some answers here ;)
- etc. etc.
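Since the scalable-classifier question names Pegasos, here is a minimal sketch of its core idea (stochastic sub-gradient descent on the regularized hinge loss with step size 1/(λt)) on a toy dataset. This is an illustration of the update rule only, not a suggested exercise solution; the data and hyperparameters are made up:

```python
import numpy as np

def pegasos(X, y, lam=0.1, epochs=20, seed=0):
    """Pegasos: stochastic sub-gradient descent for a linear SVM.

    Minimizes (lam/2)||w||^2 + hinge loss. Labels y must be +/-1.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)          # decreasing step size 1/(lam * t)
            if y[i] * (X[i] @ w) < 1:      # margin violated: hinge sub-gradient
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
            else:                          # margin satisfied: only shrink w
                w = (1 - eta * lam) * w
    return w

# Toy linearly separable data: label is the sign of x0 + x1.
X = np.array([[1.0, 1.0], [2.0, 0.0], [-1.0, -1.0], [0.0, -2.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w = pegasos(X, y)
```

The appeal for a course on scalability is that each update touches a single example, so the runtime per step is independent of the dataset size.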
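On the hashing/sketches question, a Bloom filter fits in a few lines and makes the one-sided-error trade-off concrete: lookups can return false positives but never false negatives. A minimal sketch (parameter choices here are arbitrary, for illustration):

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: k salted hashes over an m-slot bit array."""

    def __init__(self, m=1024, k=3):
        self.m, self.k = m, k
        self.bits = bytearray(m)  # one byte per bit, for simplicity

    def _indexes(self, item):
        # Derive k indexes by hashing the item with k different salts.
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(h, 16) % self.m

    def add(self, item):
        for idx in self._indexes(item):
            self.bits[idx] = 1

    def __contains__(self, item):
        # True if every bit is set; may be a false positive, never a false negative.
        return all(self.bits[idx] for idx in self._indexes(item))

bf = BloomFilter()
for word in ["gradient", "kernel", "posterior"]:
    bf.add(word)
```

With m = 1024 slots, k = 3 hashes, and only 3 inserted items, a query for an unseen key is almost surely False, which is exactly the probabilistic guarantee a lecture would quantify.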
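And on the MCMC question, the baseline every fancier sampler gets compared against is random-walk Metropolis, which takes a dozen lines. A sketch targeting a standard normal (the target, step size, and sample count are illustrative choices, not recommendations):

```python
import numpy as np

def metropolis(logp, x0, n_samples=20000, step=1.0, seed=0):
    """Random-walk Metropolis: propose a Gaussian perturbation and accept
    with probability min(1, p(x')/p(x)), computed in log space."""
    rng = np.random.default_rng(seed)
    x = x0
    samples = np.empty(n_samples)
    for t in range(n_samples):
        proposal = x + step * rng.normal()
        if np.log(rng.uniform()) < logp(proposal) - logp(x):
            x = proposal  # accept; otherwise keep the current state
        samples[t] = x
    return samples

# Target: standard normal, so log p(x) = -x^2/2 up to a constant.
samples = metropolis(lambda x: -0.5 * x * x, x0=0.0)
```

The empirical mean and variance of the chain should be close to 0 and 1, and contrasting this sampler's mixing with Hamiltonian Monte Carlo is one natural way to motivate the modern tools.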
We’d love diverse answers, short statements, long opinion articles, etc. The ultimate course in a way says something about what we think the term ML refers to.
I just want to say one word to you. Just one word. Stan.