Solving Optimization Problems with Maximum Satisfiability - Encodings and Re-Encodings
Presented By: Jeremias Berg
Abstract: NP-hard combinatorial optimization problems are encountered in numerous domains, including machine learning, data analysis, and artificial intelligence. Efficient methods for solving instances of these problems can therefore save time, money, and other resources across many applications. In this talk I give an overview of my PhD research, in which I studied exact declarative approaches to solving difficult optimization problems within the Maximum Satisfiability (MaxSAT) paradigm. More specifically, I developed MaxSAT solving techniques in the form of solver-independent preprocessing; in the talk I will discuss the effective integration of MaxSAT preprocessing and solving, as well as the overall effect that preprocessing has on MaxSAT solving. I also developed MaxSAT encodings of two important data analysis tasks: correlation clustering and bounded-treewidth Bayesian network structure learning. I will briefly discuss the encodings and show how, on many benchmarks, the MaxSAT-based approach is faster and more memory-efficient than other exact solution methods.
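For readers unfamiliar with the MaxSAT paradigm the abstract refers to: an instance consists of hard clauses, which any solution must satisfy, and weighted soft clauses, whose total falsified weight is to be minimized. The sketch below is a deliberately naive brute-force solver, only meant to illustrate the problem definition; it is not the speaker's method, and real MaxSAT solvers use far more sophisticated algorithms (and preprocessing, as discussed in the talk).

```python
from itertools import product

def satisfied(clause, assignment):
    # A clause (a list of non-zero ints) holds if some literal is true:
    # positive literal v is true when variable v is True,
    # negative literal -v is true when variable v is False.
    return any(assignment[abs(lit)] == (lit > 0) for lit in clause)

def brute_force_maxsat(n_vars, hard, soft):
    """Enumerate all assignments over variables 1..n_vars; among those
    satisfying every hard clause, return (cost, assignment) minimizing
    the total weight of falsified soft clauses.
    soft is a list of (clause, weight) pairs."""
    best_cost, best_model = None, None
    for bits in product([False, True], repeat=n_vars):
        assignment = {v + 1: bits[v] for v in range(n_vars)}
        if not all(satisfied(c, assignment) for c in hard):
            continue  # hard clauses are mandatory
        cost = sum(w for c, w in soft if not satisfied(c, assignment))
        if best_cost is None or cost < best_cost:
            best_cost, best_model = cost, assignment
    return best_cost, best_model

# Toy instance (made up for illustration): hard clause (x1 OR x2),
# soft unit clauses preferring x1 false (weight 2) and x2 false (weight 3).
cost, model = brute_force_maxsat(2, hard=[[1, 2]],
                                 soft=[([-1], 2), ([-2], 3)])
# Optimum sets x1 true and x2 false, falsifying only the weight-2 soft clause.
```

Declarative MaxSAT approaches, such as those in the talk, work by encoding a problem (e.g., correlation clustering) into clauses of this form and handing the instance to an off-the-shelf solver.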
Dr. Jeremias Berg is a postdoctoral researcher in the Constraint Reasoning and Optimization group at the University of Helsinki. After graduating with distinction in June 2018, he has been visiting research groups around the world (Melbourne in the fall of 2018, and Toronto now) in order to expand and strengthen his network of potential collaborators and to gain experience in different research environments. He earned his M.S. in Statistical Machine Learning and his B.S. in Mathematics from the University of Helsinki. His webpage can be found at www.jeremiasberg.com.