Lifted Junction Tree Algorithm

We look at probabilistic relational formalisms in which the domain objects are known. In these formalisms, standard approaches to inference use lifted variable elimination (LVE) to answer single queries, which becomes inefficient when many queries have to be answered. For efficient multi-query answering, the well-known junction tree algorithm builds a cluster representation of the underlying model. To benefit from this idea in the relational setting, we have transferred the concept of lifting to the junction tree algorithm, yielding the lifted junction tree algorithm (LJT). LJT saves computations by using a compact first-order cluster representation and LVE as a subroutine.
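As a propositional sketch of the underlying idea (lifting set aside), variable elimination answers a query by multiplying factors and summing out non-query variables; a junction tree caches such intermediate results as messages so that several queries can reuse them. The factors and variable names below are purely illustrative and not taken from the LJT implementation.

```python
from itertools import product

# Toy factors over binary variables: a factor is (vars, table), where
# table maps assignment tuples to values. Illustrative only.

def multiply(f, g):
    """Pointwise product of two factors."""
    fvars, ftab = f
    gvars, gtab = g
    allvars = fvars + tuple(v for v in gvars if v not in fvars)
    table = {}
    for assign in product([0, 1], repeat=len(allvars)):
        env = dict(zip(allvars, assign))
        table[assign] = (ftab[tuple(env[v] for v in fvars)]
                         * gtab[tuple(env[v] for v in gvars)])
    return allvars, table

def sum_out(f, var):
    """Eliminate `var` from factor f by summation."""
    fvars, ftab = f
    keep = tuple(v for v in fvars if v != var)
    table = {}
    for assign, val in ftab.items():
        key = tuple(a for v, a in zip(fvars, assign) if v != var)
        table[key] = table.get(key, 0.0) + val
    return keep, table

# P(A) and P(B|A) as factors; query P(B) by eliminating A.
fa = (("A",), {(0,): 0.6, (1,): 0.4})
fba = (("A", "B"), {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8})
joint = multiply(fa, fba)
_, pb = sum_out(joint, "A")
# pb[(0,)] = 0.6*0.9 + 0.4*0.2 = 0.62
```

In a junction tree, a result like the eliminated factor above is computed once as a message and reused across queries instead of being recomputed per query; lifting additionally treats indistinguishable objects as one representative.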

Whereas previously either the number of known objects limited the junction tree algorithm or the number of queries led to inefficient repetition with LVE, LJT allows for efficient repeated inference by incorporating the relational aspects of a model and avoiding duplicate calculations. In many settings, LJT even outperforms FOKC, another well-known algorithm for exact repeated inference. FOKC stands for first-order knowledge compilation; it solves a weighted first-order model counting problem by compiling the model into a first-order circuit, in which it computes weighted model counts.
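To make the problem FOKC solves concrete, here is a propositional sketch of weighted model counting by enumeration; FOKC solves the first-order variant by compiling the model into a circuit rather than enumerating assignments. The formula and weights are illustrative, not from any of the cited papers.

```python
from itertools import product

# Weighted model counting (WMC): sum the weights of all satisfying
# assignments, where an assignment's weight is the product of its
# literal weights. Weights below are illustrative.
weight = {("rain", True): 0.2, ("rain", False): 0.8,
          ("wet", True): 0.9, ("wet", False): 0.1}

def formula(rain, wet):
    # "rain implies wet": the models are all assignments satisfying this.
    return (not rain) or wet

wmc = 0.0
for rain, wet in product([True, False], repeat=2):
    if formula(rain, wet):
        wmc += weight[("rain", rain)] * weight[("wet", wet)]
# models: (T,T) 0.18 + (F,T) 0.72 + (F,F) 0.08 -> wmc = 0.98
```

Enumeration is exponential in the number of variables; compilation into a circuit lets the count be evaluated in time linear in the circuit size, and the first-order setting additionally exploits symmetry over domain objects.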

So far, we have extended the original LJT

  • to include counting, an additional lifting tool, to lift even more computations,
  • to identify and prevent unnecessary groundings through an additional merging step called fusion,
  • to effectively handle evidence in a lifted manner, and
  • to answer conjunctive queries that span more than one cluster.

We also work on a dynamic version of LJT (LDJT) to handle sequential data, e.g., in the form of time series. For more information, please refer to LDJT. We further apply the lifting idea to extend LVE, LJT, and LDJT

  • to compute a most probable explanation (also known as total abduction), including safe MAP queries (partial abduction),
  • to answer parameterised queries (a compact form of conjunctive query with isomorphic query terms) for lifted query answering, and
  • to compute a maximum expected utility.
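In propositional terms, the three query types above correspond to the following standard notions (written with generic notation for evidence, actions, and utilities; this is textbook notation, not specific to our papers):

```latex
% Most probable explanation (total abduction):
\mathrm{MPE}(\mathbf{e}) = \arg\max_{\mathbf{x}} P(\mathbf{X} = \mathbf{x} \mid \mathbf{e})
% MAP query (partial abduction) over a subset \mathbf{Y}, summing out the rest \mathbf{Z}:
\mathrm{MAP}(\mathbf{Y} \mid \mathbf{e}) = \arg\max_{\mathbf{y}} \sum_{\mathbf{z}} P(\mathbf{y}, \mathbf{z} \mid \mathbf{e})
% Maximum expected utility over actions a with utility function U:
\mathrm{MEU}(\mathbf{e}) = \max_{a} \sum_{\mathbf{x}} P(\mathbf{x} \mid a, \mathbf{e})\, U(\mathbf{x}, a)
```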

LVE is not the only possible subroutine: LJT allows any inference algorithm as a subroutine, as long as the algorithm fulfils certain requirements for evidence handling and message passing. The subroutine determines which types of queries LJT can answer. We have published a version of LJT, called LJTKC, that combines LVE and FOKC as subroutines.
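One way to make the subroutine requirements concrete is as an interface that a pluggable algorithm must implement; the class and method names below are hypothetical illustrations, not taken from the published LJT implementation.

```python
from abc import ABC, abstractmethod

class InferenceSubroutine(ABC):
    """Hypothetical interface for an algorithm LJT could plug in.

    A subroutine must be able to absorb evidence into a cluster's
    local model and to compute messages over separator variables;
    the query types it supports determine the query types that the
    overall junction tree algorithm supports."""

    @abstractmethod
    def absorb_evidence(self, local_model, evidence):
        """Condition the cluster's local factors on observed values."""

    @abstractmethod
    def compute_message(self, local_model, separator_vars):
        """Eliminate all non-separator variables, yielding a message."""

    @abstractmethod
    def answer_query(self, local_model, query):
        """Answer a query over variables contained in one cluster."""
```

Under this reading, LJTKC corresponds to registering two such subroutines and routing each cluster's queries to the one better suited to it.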

Implementation

A prototype implementation of LJT, based on BLOG and the LVE implementation by Taghipour, is available along with some documentation:

The web pages accompanying the implementation were prepared by Moritz Hoffmann.

Publications

2019

  • Marcel Gehrke, Tanya Braun, Ralf Möller: Lifted Temporal Most Probable Explanation
    to be published in: Proceedings of the International Conference on Conceptual Structures 2019, 2019, Springer
    BibTeX
  • Tanya Braun, Marcel Gehrke: Inference in Statistical Relational AI
    to be published in: Proceedings of the International Conference on Conceptual Structures 2019, 2019, Springer
    BibTeX
  • Marcel Gehrke, Tanya Braun, Ralf Möller: Lifted Temporal Maximum Expected Utility
    to be published in: Proceedings of the 32nd Canadian Conference on Artificial Intelligence, Canadian AI 2019, 2019, Springer
    BibTeX
  • Marcel Gehrke, Tanya Braun, Ralf Möller: Uncertain Evidence for Probabilistic Relational Models
    to be published in: Proceedings of the 32nd Canadian Conference on Artificial Intelligence, Canadian AI 2019, 2019, Springer
    BibTeX
  • Marcel Gehrke, Tanya Braun, Ralf Möller: Relational Forward Backward Algorithm for Multiple Queries
    in: Proceedings of the 32nd International Florida Artificial Intelligence Research Society Conference (FLAIRS-19), 2019, AAAI Press
    BibTeX
  • Marcel Gehrke, Tanya Braun, Ralf Möller, Alexander Waschkau, Christoph Strumann, Jost Steinhäuser: Lifted Maximum Expected Utility
    in: Artificial Intelligence in Health, 2019, Springer International Publishing, p.131-141
    DOI BibTeX
  • Tanya Braun: StaRAI or StaRDB? - A Tutorial on Statistical Relational AI
    in: BTW-19 Proceedings Datenbanksysteme für Business, Technologie und Web - Workshopband, 2019, Gesellschaft für Informatik, p.263-266
    BibTeX

2018

  • Marcel Gehrke, Tanya Braun, Ralf Möller: Answering Multiple Conjunctive Queries with the Lifted Dynamic Junction Tree Algorithm
    in: Proceedings of the AI 2018: Advances in Artificial Intelligence, 2018, Springer, p.543-555
    DOI BibTeX
  • Marcel Gehrke, Tanya Braun, Ralf Möller: Preventing Unnecessary Groundings in the Lifted Dynamic Junction Tree Algorithm
    in: Proceedings of the AI 2018: Advances in Artificial Intelligence, 2018, Springer, p.556-562
    DOI BibTeX
  • Tanya Braun, Ralf Möller: Adaptive Inference on Probabilistic Relational Models
    in: AI 2018: Advances in Artificial Intelligence, 2018, Springer, p.487-500
    DOI BibTeX
  • Tanya Braun, Ralf Möller: Fusing First-order Knowledge Compilation and the Lifted Junction Tree Algorithm
    in: Proceedings of KI 2018: Advances in Artificial Intelligence, 2018, Springer, p.24-37
    DOI BibTeX
  • Marcel Gehrke, Tanya Braun, Ralf Möller: Towards Preventing Unnecessary Groundings in the Lifted Dynamic Junction Tree Algorithm
    in: Proceedings of KI 2018: Advances in Artificial Intelligence, 2018, Springer, p.38-45
    DOI BibTeX
  • Tanya Braun, Ralf Möller: Parameterised Queries and Lifted Query Answering
    in: IJCAI-18 Proceedings of the 27th International Joint Conference on Artificial Intelligence, 2018, International Joint Conferences on Artificial Intelligence Organization, p.4980-4986
    DOI BibTeX
  • Tanya Braun, Ralf Möller: Fusing First-order Knowledge Compilation and the Lifted Junction Tree Algorithm
    in: 8th International Workshop on Statistical Relational AI at the 27th International Joint Conference on Artificial Intelligence, 2018
    Website BibTeX
  • Marcel Gehrke, Tanya Braun, Ralf Möller: Preventing Unnecessary Groundings in the Lifted Dynamic Junction Tree Algorithm
    in: 8th International Workshop on Statistical Relational AI at the 27th International Joint Conference on Artificial Intelligence, 2018
    Website BibTeX
  • Marcel Gehrke, Tanya Braun, Ralf Möller: Answering Hindsight Queries with Lifted Dynamic Junction Trees
    in: 8th International Workshop on Statistical Relational AI at the 27th International Joint Conference on Artificial Intelligence, 2018
    Website BibTeX
  • Marcel Gehrke, Tanya Braun, Ralf Möller, Alexander Waschkau, Christoph Strumann, Jost Steinhäuser: Towards Lifted Maximum Expected Utility
    in: Proceedings of the First Joint Workshop on Artificial Intelligence in Health in Conjunction with the 27th IJCAI, the 23rd ECAI, the 17th AAMAS, and the 35th ICML, 2018, CEUR-WS.org, CEUR Workshop Proceedings, Vol.2142, p.93-96
    Website BibTeX
  • Marcel Gehrke, Tanya Braun, Ralf Möller: Lifted Dynamic Junction Tree Algorithm
    in: Proceedings of the International Conference on Conceptual Structures, 2018, Springer, p.55-69
    DOI BibTeX
  • Tanya Braun, Ralf Möller: Lifted Most Probable Explanation
    in: Proceedings of the International Conference on Conceptual Structures, 2018, Springer, p.39-54
    DOI BibTeX
  • Tanya Braun, Ralf Möller: Counting and Conjunctive Queries in the Lifted Junction Tree Algorithm - Extended Version
    in: Postproceedings of the 5th International Workshop on Graph Structures for Knowledge Representation and Reasoning, 2018, Springer, p.54-72
    DOI BibTeX

2017

  • Tanya Braun, Ralf Möller: Preventing Groundings and Handling Evidence in the Lifted Junction Tree Algorithm
    in: KI 2017: Advances in Artificial Intelligence. KI 2017 - 40th Annual German Conference on AI, Dortmund, Germany, September 25-29, 2017, 2017, Springer, LNCS, Vol.10505, p.85-98
    DOI BibTeX
  • Tanya Braun, Ralf Möller: Counting and Conjunctive Queries in the Lifted Junction Tree Algorithm
    in: Graph Structures for Knowledge Representation and Reasoning - 5th International Workshop (GKR 2017), Melbourne, Australia, August 21, 2017
    Website BibTeX

2016

  • Tanya Braun, Ralf Möller: Lifted Junction Tree Algorithm
    in: KI 2016: Advances in Artificial Intelligence - 39th Annual German Conference on AI, Klagenfurt, Austria, September 26-30, 2016, 2016, Gerhard Friedrich, Malte Helmert, Franz Wotawa (Ed.), Springer, Lecture Notes in Computer Science, Vol.9904, p.30-42
    DOI BibTeX
  • Tanya Braun, Ralf Möller: Lifted Junction Tree Algorithm
    IFIS, Universität zu Lübeck, 2016, Long version of the KI 2016 conference paper
    BibTeX