Lifted Junction Tree Algorithm

We look at probabilistic relational formalisms where the domain objects are known. In these formalisms, standard approaches for inference use lifted variable elimination (LVE) to answer single queries, which leads to inefficiencies when answering multiple queries. To answer multiple queries efficiently, the well-known junction tree algorithm builds a cluster representation of the underlying model for faster query answering. To benefit from the idea behind junction trees in the relational setting, we have transferred the concept of lifting to the junction tree algorithm, yielding the lifted junction tree algorithm (LJT). LJT saves computations by building a compact first-order cluster representation and using LVE as a subroutine.

Previously, the junction tree algorithm was limited by the number of known objects, while LVE led to inefficient repeated computations as the number of queries grew. LJT allows for efficient repeated inference by incorporating the relational aspects of a model as well as avoiding duplicate calculations. In many settings, LJT even outperforms first-order knowledge compilation (FOKC), another well-known algorithm for exact repeated inference. FOKC solves a weighted first-order model counting problem by compiling the model into a first-order circuit, in which it computes weighted model counts.
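To illustrate the junction tree idea that LJT lifts, here is a minimal propositional sketch: two clusters share a separator variable, and after a single message pass the second cluster can answer queries locally, without touching the first cluster's factor again. The two-cluster model, factor tables, and function names below are our illustration, assuming binary variables; they are not taken from the LJT papers, whose contribution is performing these steps on first-order clusters with LVE.

```python
from itertools import product

# Minimal propositional junction tree sketch (the algorithm LJT lifts).
# Factors are dicts mapping assignments (tuples over {0, 1}) to potentials.

def multiply(f, f_vars, g, g_vars):
    """Pointwise product of two factors over binary variables."""
    out_vars = f_vars + [v for v in g_vars if v not in f_vars]
    table = {}
    for assign in product([0, 1], repeat=len(out_vars)):
        env = dict(zip(out_vars, assign))
        fa = tuple(env[v] for v in f_vars)
        ga = tuple(env[v] for v in g_vars)
        table[assign] = f[fa] * g[ga]
    return table, out_vars

def marginalise(f, f_vars, keep):
    """Sum out every variable not listed in keep."""
    out_vars = [v for v in f_vars if v in keep]
    table = {}
    for assign, val in f.items():
        key = tuple(a for v, a in zip(f_vars, assign) if v in keep)
        table[key] = table.get(key, 0.0) + val
    return table, out_vars

# Two clusters C1 = {A, B} and C2 = {B, C} with separator {B}.
f1 = {(0, 0): 1.0, (0, 1): 2.0, (1, 0): 3.0, (1, 1): 4.0}  # over (A, B)
f2 = {(0, 0): 1.0, (0, 1): 1.0, (1, 0): 2.0, (1, 1): 3.0}  # over (B, C)

# Message from C1 to C2: marginalise C1's factor onto the separator.
msg, msg_vars = marginalise(f1, ['A', 'B'], ['B'])

# C2 now answers P(C) locally from its own factor and the message.
belief, belief_vars = multiply(f2, ['B', 'C'], msg, msg_vars)
p_c, _ = marginalise(belief, belief_vars, ['C'])
z = sum(p_c.values())
p_c = {c: v / z for c, v in p_c.items()}
print(p_c)  # normalised distribution over C after one message pass
```

Once messages have been passed, every additional query on C2 reuses the stored message instead of re-eliminating C1's variables, which is what makes the cluster representation pay off for repeated inference.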

So far, we have extended the original LJT

  • to include the lifting tool of counting to lift even more computations, 
  • to identify and prevent unnecessary groundings through an additional merging step called fusion,
  • to effectively handle evidence in a lifted manner, and
  • to answer conjunctive queries that span more than one cluster.

We also work on a dynamic version of LJT (LDJT) to handle sequential data, e.g., in the form of time series. For more information, please refer to LDJT. We further apply the lifting idea to extend LVE, LJT, and LDJT

  • to compute a most probable explanation (also known as total abduction), including safe MAP queries (partial abduction),
  • to answer parameterised queries (a compact form of conjunctive query with isomorphic query terms) for lifted query answering, and
  • to compute a maximum expected utility.

LVE is not the only possible subroutine: LJT admits any inference algorithm as a subroutine, as long as the algorithm fulfils certain requirements for evidence handling and message passing. The subroutine determines which types of queries LJT can answer. We have published LJTKC, a version of LJT that combines LVE and FOKC as subroutines.
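As a sketch of what such a subroutine contract might look like, the interface below captures the three capabilities named above. The class and method names are our illustration, not the formal requirements from the papers:

```python
from abc import ABC, abstractmethod

# Hypothetical sketch of the contract LJT expects from a subroutine;
# the interface and names are illustrative, not the formal requirements.

class InferenceSubroutine(ABC):
    @abstractmethod
    def absorb_evidence(self, cluster, evidence):
        """Condition the cluster's local model on observed evidence."""

    @abstractmethod
    def compute_message(self, cluster, separator):
        """Eliminate all non-separator variables, yielding a message."""

    @abstractmethod
    def answer_query(self, cluster, query):
        """Answer a query covered by a single cluster's variables."""

# A trivial placeholder implementation showing the intended call pattern.
class DummySubroutine(InferenceSubroutine):
    def absorb_evidence(self, cluster, evidence):
        return {**cluster, 'evidence': evidence}

    def compute_message(self, cluster, separator):
        return {v: cluster.get(v) for v in separator}

    def answer_query(self, cluster, query):
        return cluster.get(query)

engine = DummySubroutine()
cluster = {'A': 0.4, 'B': 0.6}
print(engine.compute_message(cluster, ['B']))  # {'B': 0.6}
```

Under this reading, LJTKC would correspond to plugging two different engines behind the same interface, with LJT routing each query to the engine suited to it.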

Publications

2018

  • Marcel Gehrke, Tanya Braun, Ralf Möller: Answering Multiple Conjunctive Queries with the Lifted Dynamic Junction Tree Algorithm
to appear in: Proceedings of the 31st Australasian Joint Conference on Artificial Intelligence, 2018, Springer
    BibTeX
  • Marcel Gehrke, Tanya Braun, Ralf Möller: Preventing Unnecessary Groundings in the Lifted Dynamic Junction Tree Algorithm
to appear in: Proceedings of the 31st Australasian Joint Conference on Artificial Intelligence, 2018, Springer
    BibTeX
  • Tanya Braun, Ralf Möller: Adaptive Inference on Probabilistic Relational Models
to appear in: Proceedings of the 31st Australasian Joint Conference on Artificial Intelligence, 2018, Springer
    BibTeX
  • Marcel Gehrke, Tanya Braun, Ralf Möller: Towards Preventing Unnecessary Groundings in the Lifted Dynamic Junction Tree Algorithm
    in: Proceedings of KI 2018: Advances in Artificial Intelligence, 2018, Springer
    DOI BibTeX
  • Tanya Braun, Ralf Möller: Fusing First-order Knowledge Compilation and the Lifted Junction Tree Algorithm
    in: Proceedings of KI 2018: Advances in Artificial Intelligence, 2018, Springer, p.24-37
    DOI BibTeX:
    @inproceedings{BraMo18d,
      author = {Tanya Braun and Ralf M\"oller}, 
      title = {{Fusing First-order Knowledge Compilation and the Lifted Junction Tree Algorithm}}, 
      booktitle = {Proceedings of {KI} 2018: Advances in Artificial Intelligence}, 
      year = {2018},
      pages = {24--37},
      publisher = {Springer},
  doi = {10.1007/978-3-030-00111-7_3}
    }
  • Tanya Braun, Ralf Möller: Parameterised Queries and Lifted Query Answering
    in: IJCAI-18 Proceedings of the 27th International Joint Conference on Artificial Intelligence, 2018, International Joint Conferences on Artificial Intelligence Organization, p.4980-4986
    DOI BibTeX
  • Tanya Braun, Ralf Möller: Fusing First-order Knowledge Compilation and the Lifted Junction Tree Algorithm
    in: 8th International Workshop on Statistical Relational AI at the 27th International Joint Conference on Artificial Intelligence, 2018
    Website BibTeX
  • Marcel Gehrke, Tanya Braun, Ralf Möller: Preventing Unnecessary Groundings in the Lifted Dynamic Junction Tree Algorithm
    in: 8th International Workshop on Statistical Relational AI at the 27th International Joint Conference on Artificial Intelligence, 2018
    Website BibTeX
  • Marcel Gehrke, Tanya Braun, Ralf Möller: Answering Hindsight Queries with Lifted Dynamic Junction Trees
    in: 8th International Workshop on Statistical Relational AI at the 27th International Joint Conference on Artificial Intelligence, 2018
    Website BibTeX
  • Marcel Gehrke, Tanya Braun, Ralf Möller, Alexander Waschkau, Christoph Strumann, Jost Steinhäuser: Towards Lifted Maximum Expected Utility
    in: Proceedings of the First Joint Workshop on Artificial Intelligence in Health in Conjunction with the 27th IJCAI, the 23rd ECAI, the 17th AAMAS, and the 35th ICML, 2018, CEUR-WS.org, CEUR Workshop Proceedings, Vol.2142, p.93-96
    Website BibTeX
  • Marcel Gehrke, Tanya Braun, Ralf Möller: Lifted Dynamic Junction Tree Algorithm
    in: Proceedings of the International Conference on Conceptual Structures, 2018, Springer, p.55-69
    DOI BibTeX
  • Tanya Braun, Ralf Möller: Lifted Most Probable Explanation
    in: Proceedings of the International Conference on Conceptual Structures, 2018, Springer, p.39-54
    DOI BibTeX
  • Tanya Braun, Ralf Möller: Counting and Conjunctive Queries in the Lifted Junction Tree Algorithm - Extended Version
    in: Postproceedings of the 5th International Workshop on Graph Structures for Knowledge Representation and Reasoning, 2018, Springer
    DOI BibTeX

2017

  • Tanya Braun, Ralf Möller: Preventing Groundings and Handling Evidence in the Lifted Junction Tree Algorithm
    in: KI 2017: Advances in Artificial Intelligence. KI 2017 - 40th Annual German Conference on AI, Dortmund, Germany, September 25-29, 2017, 2017, Springer, LNCS, Vol.10505, p.85-98
    DOI BibTeX
  • Tanya Braun, Ralf Möller: Counting and Conjunctive Queries in the Lifted Junction Tree Algorithm
    in: Graph Structures for Knowledge Representation and Reasoning - 5th International Workshop (GKR 2017), Melbourne, Australia, August 21, 2017
    Website BibTeX

2016

  • Tanya Braun, Ralf Möller: Lifted Junction Tree Algorithm
    in: KI 2016: Advances in Artificial Intelligence - 39th Annual German Conference on AI, Klagenfurt, Austria, September 26-30, 2016, 2016, Gerhard Friedrich, Malte Helmert, Franz Wotawa (Ed.), Springer, Lecture Notes in Computer Science, Vol.9904, p.30-42
    DOI BibTeX
  • Tanya Braun, Ralf Möller: Lifted Junction Tree Algorithm
    IFIS, Universität zu Lübeck, 2016, Long version of the KI 2016 conference paper
    BibTeX