dugu9sword/torch_random_fields

A highly optimized library for building Markov random fields with PyTorch.

torch_random_fields is a library for building Markov random fields (MRFs) with complex topology [1] [2] in PyTorch. It is optimized for batched training on the GPU.

The key features include:

  • Easy to plug into your research code
  • Batch acceleration on the GPU for any random field with arbitrary binary or ternary connections
  • Fast training/inference with top-K logits, so even very large label spaces are not a problem
  • Support for context-aware transition matrices and low-rank factorization

You may cite this project as follows:

@inproceedings{wang2022regularized,
  title={Regularized Molecular Conformation Fields},
  author={Lihao Wang and Yi Zhou and Yiqun Wang and Xiaoqing Zheng and Xuanjing Huang and Hao Zhou},
  booktitle={Advances in Neural Information Processing Systems},
  editor={Alice H. Oh and Alekh Agarwal and Danielle Belgrave and Kyunghyun Cho},
  year={2022},
  url={https://openreview.net/forum?id=7XCFxnG8nGS}
}

Cases

Linear-Chain CRF

Check out the tutorial.

The well-known linear-chain CRF, heavily adopted in sequence labeling tasks (POS tagging, chunking, NER, etc.), is supported.
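To make the model concrete, here is a minimal Viterbi decoder for a linear-chain CRF written in plain PyTorch. This is an illustrative sketch of the standard algorithm, not the library's API; the function name and argument layout are assumptions for this example.

```python
import torch

def viterbi_decode(emissions, transitions):
    """Find the highest-scoring label sequence for one sentence.

    emissions:   (seq_len, num_labels) per-token label scores
    transitions: (num_labels, num_labels) score of moving from label i to j
    """
    seq_len, num_labels = emissions.shape
    score = emissions[0]          # best score ending in each label so far
    backpointers = []
    for t in range(1, seq_len):
        # score[i] + transitions[i, j] + emissions[t, j], maximized over i
        total = score.unsqueeze(1) + transitions + emissions[t].unsqueeze(0)
        score, best_prev = total.max(dim=0)
        backpointers.append(best_prev)
    # Follow the backpointers from the best final label
    best_last = int(score.argmax())
    path = [best_last]
    for bp in reversed(backpointers):
        path.append(int(bp[path[-1]]))
    return list(reversed(path))
```

The library's batched implementation vectorizes this recursion over a whole batch at once, but the dynamic program is the same.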


Top-K Skip-Chain CRF

Check out the tutorial.

In torch_random_fields, random fields with arbitrary topology are supported. To be precise, we require binary connections, although in some cases ternary connections are also supported (yes, I am lazy).

Here we show a case of Dynamic Skip-Chain CRF, where:

  • Some nodes (e.g., two nodes corresponding to the same word) are connected, skipping over the linear-chain connections [3]
  • Only the top-3 labels of each node are kept, which greatly speeds up training and inference [4]
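The top-K idea can be sketched in a few lines of plain PyTorch (a hypothetical helper, not the library's API): each node keeps only its K best labels, so every pairwise table shrinks from L x L to K x K.

```python
import torch

def restricted_pairwise(unary, pairwise, k=3):
    """unary: (T, L) per-node logits; pairwise: (L, L) transition scores.

    Returns the top-k logits, the surviving labels, and the (T-1, k, k)
    pairwise tables between consecutive nodes, restricted to those labels.
    """
    scores, labels = unary.topk(k, dim=-1)                 # each is (T, k)
    # Advanced indexing broadcasts (T-1, k, 1) x (T-1, 1, k) -> (T-1, k, k)
    tables = pairwise[labels[:-1].unsqueeze(-1), labels[1:].unsqueeze(-2)]
    return scores, labels, tables
```

With, say, 1000 labels and k=3, each edge's score table drops from 1,000,000 entries to 9, which is where the speedup in [4] comes from.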


Ising/Potts Model

The Ising model (or Potts model) is widely used in statistical physics and computational biology [5]. In this case, the random variables form a grid, but the graph can also be fully connected.
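As a minimal sketch of what such a model computes (not the library's API), here is the energy of a Potts model with a uniform coupling J on a 2D grid, where neighboring nodes that agree lower the energy:

```python
import torch

def potts_grid_energy(states, J):
    """states: (H, W) integer states; J: coupling strength.

    E = -J * (number of neighboring pairs with equal states),
    using free boundary conditions.
    """
    horiz = (states[:, :-1] == states[:, 1:]).sum()  # horizontal neighbor pairs
    vert = (states[:-1, :] == states[1:, :]).sum()   # vertical neighbor pairs
    return -J * (horiz + vert).float()
```

A fully connected variant would simply sum the agreement term over all node pairs instead of grid neighbors.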


Features

Learning

  • Linear-Chain CRF:
    • maximum likelihood estimation
    • structured perceptron
    • piecewise training
    • pseudo-likelihood
  • General CRF:
    • structured perceptron
    • piecewise training
    • pseudo-likelihood
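To illustrate one of the objectives above, here is a sketch of pseudo-likelihood on a linear chain (illustrative only, not the library's implementation): each node's label is predicted from its observed neighbors, which avoids computing the global partition function.

```python
import torch
import torch.nn.functional as F

def chain_pseudo_loglik(unary, trans, labels):
    """unary: (T, L) logits, trans: (L, L) transitions, labels: (T,) gold."""
    T, _ = unary.shape
    total = unary.new_zeros(())
    for t in range(T):
        logits = unary[t].clone()
        if t > 0:
            logits = logits + trans[labels[t - 1]]     # left neighbor fixed
        if t < T - 1:
            logits = logits + trans[:, labels[t + 1]]  # right neighbor fixed
        total = total + F.log_softmax(logits, dim=-1)[labels[t]]
    return total  # maximize this, or minimize its negation
```

Each term only normalizes over a single node's L labels, so the objective stays cheap even when exact likelihood is intractable for the graph.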

Inference

  • Linear-Chain CRF:
    • viterbi decoding
    • batch loopy belief propagation
    • batch naive mean field variational inference
  • General CRF:
    • batch loopy belief propagation
    • batch naive mean field variational inference
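As a toy illustration of naive mean field on a linear chain (a sketch under simplifying assumptions, not the library's batched implementation): each node keeps an approximate marginal q_t that is repeatedly refined from its neighbors' current marginals.

```python
import torch

def mean_field_marginals(unary, trans, iters=10):
    """unary: (T, L) logits, trans: (L, L); returns approx. marginals (T, L)."""
    q = torch.softmax(unary, dim=-1)
    for _ in range(iters):
        msg = torch.zeros_like(unary)
        # Expected score from the left neighbor: E_{q_{t-1}}[trans(y_{t-1}, y_t)]
        msg[1:] += q[:-1] @ trans
        # Expected score from the right neighbor: E_{q_{t+1}}[trans(y_t, y_{t+1})]
        msg[:-1] += q[1:] @ trans.t()
        q = torch.softmax(unary + msg, dim=-1)
    return q
```

For a general graph the update is the same, except each node receives an expected-score message from every neighbor rather than just left and right.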

Sampling

  • Gibbs Sampling
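For intuition, here is a compact Gibbs sweep over a 2D Ising model (an illustrative sketch; the library's sampler is batched, but the update rule is the same idea): each spin is resampled from its conditional distribution given its current neighbors.

```python
import torch

def gibbs_sweep(spins, J, generator=None):
    """spins: (H, W) tensor of +1/-1 values; J: coupling strength."""
    H, W = spins.shape
    for i in range(H):
        for j in range(W):
            # Sum of the four neighbors (free boundary conditions)
            s = 0.0
            if i > 0: s += spins[i - 1, j]
            if i < H - 1: s += spins[i + 1, j]
            if j > 0: s += spins[i, j - 1]
            if j < W - 1: s += spins[i, j + 1]
            # Conditional P(spin = +1 | neighbors) = sigmoid(2 * J * s)
            p_up = torch.sigmoid(torch.as_tensor(2.0 * J * s))
            u = torch.rand((), generator=generator)
            spins[i, j] = 1 if u < p_up else -1
    return spins
```

Repeating such sweeps yields samples from the model's distribution; at large J an aligned configuration is overwhelmingly likely to stay aligned.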

Acknowledgement

Some of the implementation borrows from these great projects, with modifications:

Reference

[1] An Introduction to Conditional Random Fields (Sutton and McCallum, 2010)

[2] Graphical Models, Exponential Families, and Variational Inference (Wainwright and Jordan, 2008)

[3] A Skip-Chain Conditional Random Field for Ranking Meeting Utterances by Importance (Galley, 2006)

[4] Fast Structured Decoding for Sequence Models (Sun, 2020)

[5] Improved contact prediction in proteins: Using pseudolikelihoods to infer Potts models (Ekeberg, 2013)
