v9.1.0

  • Optimize memory usage during training. The whole dataset is kept in CPU memory, and only the current batch is moved to GPU/CPU memory at training time. Thus, less GPU memory is required.
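This keep-on-CPU, batch-to-GPU pattern can be sketched in PyTorch. The code below is an illustrative sketch, not AGAT's actual training loop; all names are hypothetical:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical setup: the full dataset stays in CPU memory.
features = torch.randn(1_000, 16)
labels = torch.randn(1_000, 1)
loader = DataLoader(TensorDataset(features, labels), batch_size=64)

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
model = torch.nn.Linear(16, 1).to(device)

for x, y in loader:
    # Only the current batch is copied to the device, so GPU memory
    # usage is bounded by the batch size, not the dataset size.
    x, y = x.to(device), y.to(device)
    loss = torch.nn.functional.mse_loss(model(x), y)
    break  # one step is enough for illustration
```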

  • Optimize dataset management so that one can manipulate the dataset in memory instead of performing I/O on disk.

    • Deprecate LoadDataset and Collater in agat/data/load_dataset.py.

    • Return a new AGAT Dataset object when indexing a Dataset with an int, slice, tuple, or list.

    • ReadGraphs, BuildDatabase, concat_graphs, concat_dataset, and select_graphs_from_dataset_random now return a Dataset.

    • Add __repr__ method to Dataset.

    • Add save method to Dataset.
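The indexing and __repr__ behaviour described above can be illustrated with a minimal stand-in class. This is a sketch of the stated behaviour only, not AGAT's actual Dataset implementation:

```python
class Dataset:
    """Minimal stand-in: indexing with int, slice, tuple, or list
    returns a new Dataset, as described in the changelog."""

    def __init__(self, graphs):
        self.graphs = list(graphs)

    def __getitem__(self, idx):
        if isinstance(idx, int):
            return Dataset([self.graphs[idx]])
        if isinstance(idx, slice):
            return Dataset(self.graphs[idx])
        if isinstance(idx, (tuple, list)):
            return Dataset([self.graphs[i] for i in idx])
        raise TypeError(f'unsupported index type: {type(idx)!r}')

    def __len__(self):
        return len(self.graphs)

    def __repr__(self):
        return f'Dataset(num_graphs={len(self.graphs)})'

ds = Dataset(range(10))
subset = ds[[1, 3, 5]]        # list indexing -> new Dataset of 3 graphs
```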

  • Optimize transfer learning in agat/model/fit.py.

  • Optimize agat/app/calculators.py, .../calculators.py, and .../ensembles.py; deprecate agat/app/app.py.

  • Add CrystalGraph and AseGraphTorch to agat/data/build_graph.py.

  • Update documentation.

v9.0.1

v9.0.0

Note: from this version onward (inclusive), AGAT cannot load models trained with earlier versions. If you need to do so, please use v8.0.5: https://pypi.org/project/agat/8.0.5/

v8.0.3

v8.0.0

  • Convert the backend from TensorFlow to PyTorch.

  • Update docs.

v7.14.0

v7.13.4

v7.13.3

  • Use a self-defined TensorFlow-based function to calculate Pearson r: agat/lib/GatLib.py#L248-L259

    This self-defined function handles inputs that would otherwise raise ValueError: array must not contain infs or NaNs.
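The idea can be illustrated with a small NumPy version that masks non-finite pairs before computing the correlation. This is a sketch of the technique, not the TensorFlow implementation in agat/lib/GatLib.py:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation that ignores pairs containing inf or NaN,
    avoiding 'ValueError: array must not contain infs or NaNs'."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    mask = np.isfinite(x) & np.isfinite(y)   # drop non-finite pairs
    x, y = x[mask], y[mask]
    xm, ym = x - x.mean(), y - y.mean()
    return float(np.sum(xm * ym) / np.sqrt(np.sum(xm**2) * np.sum(ym**2)))

# The NaN pair is dropped; the remaining points lie exactly on y = 2x.
r = pearson_r([1.0, 2.0, np.nan, 4.0], [2.0, 4.0, 6.0, 8.0])   # -> 1.0
```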

  • Fix a bug.

  • Clip optimizer gradients: clipnorm=1.0
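Norm-based gradient clipping rescales a gradient whose L2 norm exceeds the threshold, which is what a Keras optimizer's clipnorm argument does per gradient tensor. A minimal NumPy sketch (hypothetical helper, not AGAT code):

```python
import numpy as np

def clip_by_norm(grad, clipnorm=1.0):
    """Rescale `grad` so its L2 norm does not exceed `clipnorm`;
    gradients already within the threshold pass through unchanged."""
    norm = np.linalg.norm(grad)
    if norm > clipnorm:
        grad = grad * (clipnorm / norm)
    return grad

g = clip_by_norm(np.array([3.0, 4.0]))   # norm 5.0 -> rescaled to norm 1.0
```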

v7.13.2

v7.13.1

v7.13

v7.12.2

v7.12.1

v7.12

  • Release pip wheel.

  • Simplify the package. See v1.0.0 for details of the first release.

v1.0.0 DOI

First release to reproduce the results and support the conclusions of Design High-Entropy Electrocatalyst via Interpretable Deep Graph Attention Learning.