Releases: rinikerlab/DASH-tree

DASH-Tree J. Chem. Phys. published

23 Feb 10:18
45ca497

DASH Tree

This release corresponds to the code used to generate the results reported in the publication:

DASH properties: Estimating atomic and molecular properties from a dynamic attention-based substructure hierarchy

Marc T. Lehner, Paul Katzberger, Niels Maeder, Gregory A. Landrum, and Sereina Riniker
J. Chem. Phys. 161, 074103 (2024)
DOI: 10.1063/5.0218154

Abstract

Recently, we presented a method to assign atomic partial charges based on the DASH (dynamic attention-based substructure hierarchy) tree with high efficiency and quantum mechanical (QM)-like accuracy. In addition, the approach can be considered “rule based”—where the rules are derived from the attention values of a graph neural network—and thus, each assignment is fully explainable by visualizing the underlying molecular substructures. In this work, we demonstrate that these hierarchically sorted substructures capture the key features of the local environment of an atom and allow us to predict different atomic properties with high accuracy without building a new DASH tree for each property. The fast prediction of atomic properties in molecules with the DASH tree can, for example, be used as an efficient way to generate feature vectors for machine learning without the need for expensive QM calculations. The final DASH tree with the different atomic properties as well as the complete dataset with wave functions is made freely available.
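The central idea of the abstract — that a single substructure hierarchy can serve several atomic properties at once — can be sketched with a small toy example. All names below (`PropNode`, `lookup_properties`, the feature tuples) are illustrative inventions for this sketch, not the actual DASH-tree API: each node stores a dictionary of properties, and one traversal returns all of them as a ready-made feature vector.

```python
# Toy sketch: one substructure hierarchy serving multiple atomic
# properties (illustrative only, not the actual DASH-tree implementation).
from dataclasses import dataclass, field

@dataclass
class PropNode:
    props: dict                                   # property name -> value at this node
    children: dict = field(default_factory=dict)  # neighbor feature -> PropNode

def lookup_properties(root, features):
    """Walk the hierarchy along the atom's environment features and return
    the property dict of the deepest matching node -- a feature vector
    obtained without any QM calculation."""
    node = root
    for f in features:
        if f not in node.children:
            break                                 # stop at the deepest match
        node = node.children[f]
    return node.props

# A tiny hierarchy for carbon, refined by a first-shell neighbor feature.
root = PropNode(
    {"charge": -0.05, "polarizability": 1.1},
    children={("O", 1): PropNode({"charge": 0.25, "polarizability": 0.9})},
)

print(lookup_properties(root, [("O", 1)]))  # -> {'charge': 0.25, 'polarizability': 0.9}
print(lookup_properties(root, [("N", 1)]))  # no match: fall back to the root values
```

Because every property shares the same traversal, adding a new property only means storing one more value per node rather than building a new tree.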

DASH-Tree_JCIM_published

09 Oct 18:40
1100c80

DASH Tree

This release corresponds to the code used to generate the results reported in the publication:

DASH: Dynamic Attention-Based Substructure Hierarchy for Partial Charge Assignment

Marc T. Lehner, Paul Katzberger, Niels Maeder, Carl C. G. Schiebroek, Jakob Teetz, Gregory A. Landrum, and Sereina Riniker
Journal of Chemical Information and Modeling 2023, 63 (19), 6014–6028
DOI: 10.1021/acs.jcim.3c00800

Abstract

We present a robust and computationally efficient approach for assigning partial charges of atoms in
molecules. The method is based on a hierarchical tree constructed from attention values extracted
from a graph neural network (GNN), which was trained to predict atomic partial charges from accurate
quantum-mechanical (QM) calculations. The resulting dynamic attention-based substructure hierarchy
(DASH) approach provides fast assignment of partial charges with the same accuracy as the GNN itself,
is software-independent, and can easily be integrated in existing parametrization pipelines as shown for
the Open Force Field (OpenFF). The implementation of the DASH workflow, the final DASH tree, and
the training set are available as open source / open data from public repositories.
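The tree-based charge assignment described above can be illustrated with a minimal sketch. The names here (`Node`, `assign`, `normalize`, the feature tuples) are invented for illustration and are not the actual serenityff-charge API; likewise, the even redistribution in `normalize` is just one simple way to make the per-molecule charges sum exactly to the total charge, not necessarily the scheme used in the paper.

```python
# Toy sketch of hierarchical (DASH-style) charge assignment
# (illustrative names, not the actual serenityff-charge API).
from dataclasses import dataclass, field

@dataclass
class Node:
    charge: float                                 # partial charge stored at this node
    children: dict = field(default_factory=dict)  # neighbor feature -> Node

def assign(root, features):
    """Descend along the atom's environment features and return the
    charge at the deepest matching node."""
    node = root
    for f in features:
        if f not in node.children:
            break                                 # deepest match reached
        node = node.children[f]
    return node.charge

def normalize(charges, total=0.0):
    """Spread any residual evenly so the molecular charge is exact."""
    shift = (total - sum(charges)) / len(charges)
    return [q + shift for q in charges]

# Tiny trees: carbon refined by an attached oxygen; oxygen unrefined.
carbon = Node(-0.1, {("O", 1): Node(0.3)})
oxygen = Node(-0.4)

raw = [assign(carbon, [("O", 1)]), assign(oxygen, [])]  # [0.3, -0.4]
print(normalize(raw))  # shifted so the charges sum exactly to 0.0
```

The lookup itself is just a dictionary walk per atom, which is why this kind of assignment is fast enough to drop into an existing parametrization pipeline.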

DASH-Tree arXiv submission

26 May 14:38
fa02269

DASH Tree

This release corresponds to the code used to generate the results of the arXiv submission:

DASH: Dynamic Attention-Based Substructure Hierarchy for Partial Charge Assignment

Marc T. Lehner, Paul Katzberger, Niels Maeder, Carl C. G. Schiebroek, Jakob Teetz, Gregory A. Landrum, and Sereina Riniker
https://doi.org/10.48550/arXiv.2305.15981

Abstract

We present a robust and computationally efficient approach for assigning partial charges of atoms in
molecules. The method is based on a hierarchical tree constructed from attention values extracted
from a graph neural network (GNN), which was trained to predict atomic partial charges from accurate
quantum-mechanical (QM) calculations. The resulting dynamic attention-based substructure hierarchy
(DASH) approach provides fast assignment of partial charges with the same accuracy as the GNN itself,
is software-independent, and can easily be integrated in existing parametrization pipelines as shown for
the Open Force Field (OpenFF). The implementation of the DASH workflow, the final DASH tree, and
the training set are available as open source / open data from public repositories.