Machine Learning for Information Networks
Speaker: Oliver Schulte, Simon Fraser University
Abstract: Information networks provide information about different types of entities, different types of links between these entities, and attributes of both entities and links. Information networks are ubiquitous; many organizations maintain them in relational databases. This talk presents challenges and solutions for learning Bayes nets from information network data. Bayes nets are a widely used model class that represents correlations and causal relationships graphically. I define a novel semantics for first-order Bayes nets based on the classic random selection semantics of Bacchus and Halpern for probabilistic logic. The theoretical basis of our Bayes net structure learning algorithm is a new method for defining provably consistent model selection scores (e.g., BIC, BDeu) for information networks. The talk describes some of the statistical-relational applications supported by Bayes nets, such as density estimation, classification, and anomaly detection for information networks.
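As background for readers unfamiliar with model selection scores: in the standard (non-relational) setting, BIC trades off a model's log-likelihood against a penalty on its parameter count. The talk's contribution is extending such scores consistently to information networks; the sketch below shows only the classical single-table version, with all numbers illustrative.

```python
import math

def bic_score(log_likelihood, num_params, num_samples):
    """Classical BIC under the 'higher is better' sign convention:
    log-likelihood minus a complexity penalty that grows with
    both parameter count and sample size."""
    return log_likelihood - (num_params / 2.0) * math.log(num_samples)

# Toy comparison (made-up values): the richer model's likelihood gain
# must exceed the extra penalty for it to be preferred.
simple_model = bic_score(log_likelihood=-120.0, num_params=4, num_samples=1000)
rich_model = bic_score(log_likelihood=-118.0, num_params=12, num_samples=1000)
preferred = "simple" if simple_model > rich_model else "rich"
```

In Bayes net structure learning, such a score is evaluated for each candidate graph, and consistency means that with enough data the true structure maximizes the score; defining this correctly when the data are linked entities rather than i.i.d. rows is the challenge the talk addresses.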
Biography: Oliver Schulte is a Professor in the School of Computing Science at Simon Fraser University, Vancouver, Canada. He received a BSc from the University of Toronto in 1992 and a PhD from Carnegie Mellon University in 1997. His current research focuses on machine learning for structured data, such as relational and event data. He has given several tutorials on relational learning (www.aaai.org/Conferences/AAAI/2017/aaai17tutorials.php#SUA2). His publications in leading AI and machine learning venues address a variety of topics, including learning Bayesian networks, relational learning, and computational logic. While he has won some nice awards, his biggest claim to fame may be a draw against chess world champion Garry Kasparov.