Yee Whye Teh

Yee-Whye Teh
Alma mater: University of Waterloo (BMath); University of Toronto (PhD)
Known for: Hierarchical Dirichlet process; Deep belief networks
Scientific career
Fields: Machine learning; Artificial intelligence; Statistics; Computer science[1]
Institutions: University of Oxford; DeepMind; University College London; University of California, Berkeley; National University of Singapore[2]
Thesis: Bethe free energy and contrastive divergence approximations for undirected graphical models (2003)
Doctoral advisor: Geoffrey Hinton[3]

Yee-Whye Teh is a professor of statistical machine learning in the Department of Statistics, University of Oxford.[4][5] Prior to 2012, he was a reader at the Gatsby Computational Neuroscience Unit at University College London.[6] His work is primarily in machine learning, artificial intelligence, statistics and computer science.[1][7]

Education

Teh was educated at the University of Waterloo and the University of Toronto, where he was awarded a PhD in 2003 for research supervised by Geoffrey Hinton.[3][8]

Research and career

Teh was a postdoctoral fellow at the University of California, Berkeley, and at the National University of Singapore before joining University College London as a lecturer.[2]

Teh was one of the original developers of deep belief networks[9] and of hierarchical Dirichlet processes.[10]

Awards and honours

Teh was a keynote speaker at Uncertainty in Artificial Intelligence (UAI) 2019, and was invited to give the Breiman lecture at the Conference on Neural Information Processing Systems (NeurIPS) 2017.[11] He served as program co-chair of the International Conference on Machine Learning (ICML) in 2017, one of the premier conferences in machine learning.[4]

References

  1. Yee-Whye Teh publications indexed by Google Scholar.
  2. 2.0 2.1 "Yee-Whye Teh, Professor of Statistical Machine Learning". https://www.stats.ox.ac.uk/people/yee-whye-teh. 
  3. 3.0 3.1 Yee Whye Teh at the Mathematics Genealogy Project
  4. {{{1}}}
  5. Gram-Hansen, Bradley (2021). Extending probabilistic programming systems and applying them to real-world simulators (DPhil thesis). University of Oxford. OCLC 1263818188. EThOS uk.bl.ethos.833365.
  6. Gasthaus, Jan Alexander (2020). Hierarchical Bayesian nonparametric models for power-law sequences (PhD thesis). University College London. OCLC 1197757196. EThOS uk.bl.ethos.807804.
  7. Yee-Whye Teh at DBLP.
  8. Teh, Yee Whye (2003). Bethe free energy and contrastive divergence approximations for undirected graphical models (PhD thesis). University of Toronto. hdl:1807/122253. OCLC 56683361. ProQuest 305242430.
  9. Hinton, Geoffrey E.; Osindero, Simon; Teh, Yee-Whye (2006). "A fast learning algorithm for deep belief nets". Neural Computation 18 (7): 1527–1554. Wikidata Q33996665.
  10. Teh, Yee Whye; Jordan, Michael I.; Beal, Matthew J.; Blei, David M. (2006). "Hierarchical Dirichlet Processes". Journal of the American Statistical Association 101 (476): 1566–1581. Wikidata Q77688418.
  11. "On Bayesian Deep Learning and Deep Bayesian Learning". https://nips.cc/Conferences/2017/Schedule?showEvent=8726.