Sepp Hochreiter
| Sepp Hochreiter | |
|---|---|
| Hochreiter in 2012 | |
| Born | 14 February 1967, Mühldorf, West Germany |
| Nationality | German |
| Alma mater | Technische Universität München |
| Scientific career | |
| Fields | Machine learning, bioinformatics |
| Institutions | Johannes Kepler University Linz |
| Thesis | Generalisierung bei neuronalen Netzen geringer Komplexität (1999) |
| Doctoral advisor | Wilfried Brauer |
Josef "Sepp" Hochreiter (born 14 February 1967) is a German computer scientist. Since 2018 he has led the Institute for Machine Learning at the Johannes Kepler University of Linz after having led the Institute of Bioinformatics from 2006 to 2018. In 2017 he became the head of the Linz Institute of Technology (LIT) AI Lab. Hochreiter is also a founding director of the Institute of Advanced Research in Artificial Intelligence (IARAI).[1] Previously, he was at the Technical University of Berlin, at the University of Colorado at Boulder, and at the Technical University of Munich. He is a chair of the Critical Assessment of Massive Data Analysis (CAMDA) conference.[2]
Hochreiter has made contributions in the fields of machine learning, deep learning, and bioinformatics, most notably the development of the long short-term memory (LSTM) neural network architecture,[3][4] but also in meta-learning,[5] reinforcement learning,[6][7] and biclustering with applications to bioinformatics data.
Scientific career
Long short-term memory (LSTM)
Hochreiter developed the long short-term memory (LSTM) neural network architecture in his 1991 diploma thesis, which led to the main publication in 1997.[3][4] LSTM overcomes the vanishing and exploding gradient problem, a numerical instability that prevents recurrent neural networks (RNNs) from learning dependencies across long sequences.[3][8][9] In 2007, Hochreiter and others successfully applied LSTM with an optimized architecture to very fast protein homology detection without requiring a sequence alignment.[10] LSTM networks have also been used in Google Voice for transcription[11] and search,[12] and in the Google Allo chat app for generating response suggestions with low latency.[13]
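The core idea is that the cell state is updated additively under the control of multiplicative gates, which lets error signals flow across many time steps. The following NumPy sketch of a single LSTM step is purely illustrative, not code from the original publication: the forget gate shown here was added to LSTM in later work, and the parameter layout, names, and toy usage are assumptions of this example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One forward step of an LSTM cell with input, forget and output gates.
    W, U, b hold the stacked parameters for the gates i, f, o and the
    candidate cell content g."""
    n = h_prev.shape[0]
    z = W @ x + U @ h_prev + b          # stacked pre-activations, shape (4*n,)
    i = sigmoid(z[0*n:1*n])             # input gate: how much new content to write
    f = sigmoid(z[1*n:2*n])             # forget gate: how much old cell state to keep
    o = sigmoid(z[2*n:3*n])             # output gate: how much of the cell to expose
    g = np.tanh(z[3*n:4*n])             # candidate cell content
    c = f * c_prev + i * g              # additive cell update -> stable error flow
    h = o * np.tanh(c)                  # hidden state passed to the next step
    return h, c

# Toy usage with random parameters: input size 3, hidden size 4.
rng = np.random.default_rng(0)
d, n = 3, 4
W, U, b = rng.normal(size=(4*n, d)), rng.normal(size=(4*n, n)), np.zeros(4*n)
h, c = np.zeros(n), np.zeros(n)
for t in range(10):
    h, c = lstm_step(rng.normal(size=d), h, c, W, U, b)
```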
Other machine learning contributions
Beyond LSTM, Hochreiter has developed "Flat Minimum Search" to improve the generalization of neural networks[14] and introduced rectified factor networks (RFNs) for sparse coding,[15][16] which have been applied in bioinformatics and genetics.[17] He also introduced modern Hopfield networks with continuous states[18] and applied them to immune repertoire classification.[19]
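The retrieval step of these continuous modern Hopfield networks is the softmax-based update xi_new = X softmax(beta * X^T xi), where the columns of X are stored patterns and beta is an inverse temperature. A small NumPy sketch of this update follows; it is a simplified illustration, and the pattern dimensions, the value of beta, and the fixed number of iterations are arbitrary choices made here.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def hopfield_retrieve(X, xi, beta=8.0, n_steps=3):
    """Continuous modern Hopfield update: xi <- X softmax(beta * X^T xi).
    X has one stored pattern per column; xi is the (possibly noisy) query."""
    for _ in range(n_steps):
        xi = X @ softmax(beta * (X.T @ xi))
    return xi

# Toy usage: store 5 random patterns and retrieve from a corrupted query.
rng = np.random.default_rng(0)
X = rng.normal(size=(16, 5))                 # 5 patterns of dimension 16
query = X[:, 2] + 0.3 * rng.normal(size=16)  # noisy version of pattern 2
restored = hopfield_retrieve(X, query)
print(np.argmax(X.T @ restored))             # typically recovers index 2
```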
Hochreiter worked with Jürgen Schmidhuber in the field of reinforcement learning on actor-critic systems that learn by "backpropagation through a model".[6][20]
Hochreiter has been involved in developing factor analysis methods with applications to bioinformatics, including FABIA for biclustering,[21] HapFABIA for detecting short segments of identity by descent,[22] and FARMS for preprocessing and summarizing high-density oligonucleotide DNA microarrays in RNA gene expression analysis.[23]
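FABIA models the data matrix as a sum of outer products of sparse loading and factor vectors plus additive noise, so each bicluster corresponds to one such outer product. The NumPy sketch below only simulates data under this generative model; it does not implement FABIA's model fitting, and all dimensions and sparsity levels are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_genes, n_samples, n_biclusters = 200, 60, 3

# Sparse loadings (which genes belong to each bicluster) and sparse factors
# (which samples belong to it); most entries are zero.
L = np.zeros((n_genes, n_biclusters))
Z = np.zeros((n_biclusters, n_samples))
for k in range(n_biclusters):
    genes = rng.choice(n_genes, size=15, replace=False)
    samples = rng.choice(n_samples, size=10, replace=False)
    L[genes, k] = rng.normal(2.0, 0.5, size=15)
    Z[k, samples] = rng.normal(2.0, 0.5, size=10)

# Each bicluster is the outer product of a sparse loading and a sparse factor;
# the observed matrix is their sum plus additive noise.
X = L @ Z + 0.1 * rng.normal(size=(n_genes, n_samples))
```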
In 2006, Hochreiter and others proposed an extension of the support vector machine (SVM), the "Potential Support Vector Machine" (PSVM),[24] which can be applied to non-square kernel matrices and can be used with kernels that are not positive definite. Hochreiter and his collaborators have applied PSVM to feature selection, including gene selection for microarray data.[25][26][27]
Awards
Hochreiter was awarded the IEEE CIS Neural Networks Pioneer Award in 2021 for his work on LSTM.[28]
References
- ↑ "IARAI – INSTITUTE OF ADVANCED RESEARCH IN ARTIFICIAL INTELLIGENCE". https://www.iarai.ac.at/.
- ↑ "CAMDA 2021". http://www.camda.info/.
- ↑ 3.0 3.1 3.2 Hochreiter, S. (1991). Untersuchungen zu dynamischen neuronalen Netzen (PDF) (diploma thesis). Technical University Munich, Institute of Computer Science.
- ↑ 4.0 4.1 Hochreiter, S.; Schmidhuber, J. (1997). "Long Short-Term Memory". Neural Computation 9 (8): 1735–1780. doi:10.1162/neco.1997.9.8.1735. PMID 9377276.
- ↑ Hochreiter, S.; Younger, A. S.; Conwell, P. R. (2001). "Learning to Learn Using Gradient Descent". Artificial Neural Networks — ICANN 2001. Lecture Notes in Computer Science. 2130. pp. 87–94. doi:10.1007/3-540-44668-0_13. ISBN 978-3-540-42486-4. http://www.bioinf.jku.at/publications/older/1504.pdf.
- ↑ 6.0 6.1 Hochreiter, S. (1991). Implementierung und Anwendung eines neuronalen Echtzeit-Lernalgorithmus für reaktive Umgebungen (Report). http://www.bioinf.jku.at/publications/older/fopra.pdf.
- ↑ Arjona-Medina, J. A.; Gillhofer, M.; Widrich, M.; Unterthiner, T.; Hochreiter, S. (2018). "RUDDER: Return Decomposition for Delayed Rewards". arXiv:1806.07857 [cs.LG].
- ↑ Hochreiter, S. (1998). "The Vanishing Gradient Problem During Learning Recurrent Neural Nets and Problem Solutions". International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems 06 (2): 107–116. doi:10.1142/S0218488598000094. ISSN 0218-4885.
- ↑ Hochreiter, S.; Bengio, Y.; Frasconi, P.; Schmidhuber, J. (2000). "Gradient flow in recurrent nets: the difficulty of learning long-term dependencies". in Kolen, J. F.; Kremer, S. C.. A Field Guide to Dynamical Recurrent Networks. New York City: IEEE Press. pp. 237–244.
- ↑ Hochreiter, S.; Heusel, M.; Obermayer, K. (2007). "Fast model-based protein homology detection without alignment". Bioinformatics 23 (14): 1728–1736. doi:10.1093/bioinformatics/btm247. PMID 17488755.
- ↑ "The neural networks behind Google Voice transcription". 11 August 2015. http://googleresearch.blogspot.co.at/2015/08/the-neural-networks-behind-google-voice.html.
- ↑ "Google voice search: faster and more accurate". 24 September 2015. http://googleresearch.blogspot.co.uk/2015/09/google-voice-search-faster-and-more.html.
- ↑ Khaitan, Pranav (May 18, 2016). "Chat Smarter with Allo" (in en). http://ai.googleblog.com/2016/05/chat-smarter-with-allo.html.
- ↑ Hochreiter, S.; Schmidhuber, J. (1997). "Flat Minima". Neural Computation 9 (1): 1–42. doi:10.1162/neco.1997.9.1.1. PMID 9117894.
- ↑ Clevert, D.-A.; Mayr, A.; Unterthiner, T.; Hochreiter, S. (2015). "Rectified Factor Networks". arXiv:1502.06464v2 [cs.LG].
- ↑ Clevert, D.-A.; Mayr, A.; Unterthiner, T.; Hochreiter, S. (2015). "Rectified Factor Networks". Advances in Neural Information Processing Systems 29.
- ↑ Clevert, D.-A.; Unterthiner, T.; Povysil, G.; Hochreiter, S. (2017). "Rectified factor networks for biclustering of omics data". Bioinformatics 33 (14): i59–i66. doi:10.1093/bioinformatics/btx226. PMID 28881961.
- ↑ Ramsauer, H.; Schäfl, B.; Lehner, J.; Seidl, P.; Widrich, M.; Gruber, L.; Holzleitner, M.; Pavlović, M.; Sandve, G. K.; Greiff, V.; Kreil, D.; Kopp, M.; Klambauer, G.; Brandstetter, J.; Hochreiter, S. (2020). "Hopfield Networks is All You Need". arXiv:2008.02217 [cs.NE].
- ↑ Widrich, M.; Schäfl, B.; Ramsauer, H.; Pavlović, M.; Gruber, L.; Holzleitner, M.; Brandstetter, J.; Sandve, G. K.; Greiff, V.; Hochreiter, S.; Klambauer, G. (2020). "Modern Hopfield Networks and Attention for Immune Repertoire Classification". arXiv:2007.13505 [cs.LG].
- ↑ Template:Cite tech report
- ↑ Hochreiter, Sepp; Bodenhofer, Ulrich; Heusel, Martin; Mayr, Andreas; Mitterecker, Andreas; Kasim, Adetayo; Khamiakova, Tatsiana; Van Sanden, Suzy et al. (2010-06-15). "FABIA: factor analysis for bicluster acquisition". Bioinformatics 26 (12): 1520–1527. doi:10.1093/bioinformatics/btq227. PMID 20418340.
- ↑ Hochreiter, S. (2013). "HapFABIA: Identification of very short segments of identity by descent characterized by rare variants in large sequencing data". Nucleic Acids Research 41 (22): e202. doi:10.1093/nar/gkt1013. PMID 24174545.
- ↑ Hochreiter, S.; Clevert, D.-A.; Obermayer, K. (2006). "A new summarization method for affymetrix probe level data". Bioinformatics 22 (8): 943–949. doi:10.1093/bioinformatics/btl033. PMID 16473874.
- ↑ Hochreiter, S.; Obermayer, K. (2006). "Support Vector Machines for Dyadic Data". Neural Computation 18 (6): 1472–1510. doi:10.1162/neco.2006.18.6.1472. PMID 16764511.
- ↑ Hochreiter, S.; Obermayer, K. (2006). Nonlinear Feature Selection with the Potential Support Vector Machine. Feature Extraction, Studies in Fuzziness and Soft Computing. pp. 419–438. doi:10.1007/978-3-540-35488-8_20. ISBN 978-3-540-35487-1.
- ↑ Hochreiter, S.; Obermayer, K. (2003). "Classification and Feature Selection on Matrix Data with Application to Gene-Expression Analysis". 54th Session of the International Statistical Institute. http://www.bioinf.jku.at/publications/bioinf/older/0804.ps.
- ↑ Hochreiter, S.; Obermayer, K. (2004). "Gene Selection for Microarray Data". Kernel Methods in Computational Biology (MIT Press): 319–355. http://www.bioinf.jku.at/publications/bioinf/older/0604.ps.
- ↑ "Sepp Hochreiter receives IEEE CIS Neural Networks Pioneer Award 2021 - IARAI". 24 July 2020. https://www.iarai.ac.at/news/sepp-hochreiter-receives-ieee-cis-neural-networks-pioneer-award-2021/.
Original source: https://en.wikipedia.org/wiki/Sepp_Hochreiter