
Scaling probabilistic programming with programmable inference.


🔎 What is GenJAX?

Gen is a multi-paradigm (generative, differentiable, incremental) language for probabilistic programming focused on generative functions: computational objects which represent probability measures over structured sample spaces.

GenJAX is an implementation of Gen on top of JAX. It exposes the ability to programmatically construct and manipulate generative functions, as well as to JIT compile and auto-batch inference computations that use generative functions onto GPU devices.
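
For example, every generative function supports Gen's trace-based interface (e.g. simulate, which samples a complete execution trace), and these methods are ordinary JAX functions that you can jit and vmap directly. Here's a minimal sketch (a toy model for illustration, assuming the simulate / Trace interface used in the quick example below):

import jax
from genjax import gen, normal

# A toy generative function: one latent draw at address "x".
@gen
def toy_model(mu):
    x = normal(mu, 1.0) @ "x"
    return x

key = jax.random.key(0)

# `simulate` samples a full trace: random choices, return value, and score.
trace = toy_model.simulate(key, (0.0,))
trace.get_choices()["x"]  # the value sampled at address "x"
trace.get_score()         # log density of the sampled choices

# Auto-batching: vmap over keys produces a batch of traces.
sub_keys = jax.random.split(key, 100)
traces = jax.vmap(toy_model.simulate, in_axes=(0, None))(sub_keys, (0.0,))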

Tip

GenJAX is part of a larger ecosystem of probabilistic programming tools based upon Gen. Explore more...

Quickstart

To install GenJAX, run

pip install genjax

Then install JAX using this guide to choose the command for the architecture you're targeting. To run GenJAX without GPU support:

pip install "jax[cpu]~=0.4.24"

On a Linux machine with a GPU, run the following command:

pip install "jax[cuda12]~=0.4.24"
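
To check which backend JAX picked up (a standard JAX check, not specific to GenJAX):

import jax
print(jax.devices())  # e.g. [CpuDevice(id=0)] on CPU, or [CudaDevice(id=0)] on GPU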

Quick example

The following code snippet defines a generative function called beta_bernoulli that

  • takes two shape parameters, α and β,
  • uses them to create and draw a value p from a Beta distribution, and
  • flips a coin that returns 1 with probability p and 0 with probability 1 - p, and returns that value.

Then, we create an inference problem (by specifying a posterior target) and use sampling importance resampling (SIR) to produce a single-sample estimate of p.

We can JIT compile that entire process, run it in parallel, and more; here we use this to produce an estimate of p averaged over 50 independent trials of SIR (with K = 50 particles).

import jax
import jax.numpy as jnp
import genjax
from genjax import beta, flip, gen, Target, ChoiceMap
from genjax.inference.smc import ImportanceK

# Create a generative model.
@gen
def beta_bernoulli(α, β):
    p = beta(α, β) @ "p"
    v = flip(p) @ "v"
    return v

@jax.jit
def run_inference(obs: bool):
    # Create an inference query - a posterior target - by specifying
    # the model, arguments to the model, and constraints.
    posterior_target = Target(beta_bernoulli, # the model
                              (2.0, 2.0), # arguments to the model
                              ChoiceMap.d({"v": obs}), # constraints
                            )

    # Use a library algorithm, or design your own - more on that in the docs!
    alg = ImportanceK(posterior_target, k_particles=50)

    # Everything is JAX compatible by default.
    # JIT, vmap, to your heart's content.
    key = jax.random.key(314159)
    sub_keys = jax.random.split(key, 50)
    _, p_chm = jax.vmap(alg.random_weighted, in_axes=(0, None))(
        sub_keys, posterior_target
    )

    # An estimate of `p` over 50 independent trials of SIR (with K = 50 particles).
    return jnp.mean(p_chm["p"])

(run_inference(True), run_inference(False))
# (Array(0.6039314, dtype=float32), Array(0.3679334, dtype=float32))
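
As a quick sanity check (standard Beta-Bernoulli conjugacy, not part of the example above): with a Beta(2, 2) prior, the exact posterior over p is Beta(3, 2) when v = 1 and Beta(2, 3) when v = 0, with means 0.6 and 0.4. The SIR estimates above land close to both:

# Exact posterior mean under Beta-Bernoulli conjugacy:
# p | v ~ Beta(α + v, β + (1 - v)); the mean of Beta(a, b) is a / (a + b).
def exact_posterior_mean(α, β, v):
    a, b = α + v, β + (1 - v)
    return a / (a + b)

exact_posterior_mean(2.0, 2.0, 1)  # 0.6 (SIR estimate: ~0.604)
exact_posterior_mean(2.0, 2.0, 0)  # 0.4 (SIR estimate: ~0.368)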

References

Many bits of knowledge have gone into this project; you can find many of them at the MIT Probabilistic Computing Project page, under publications.

JAX influences

This project has several JAX-based influences; the Oryx library credited in the acknowledgements below is one of them.

Acknowledgements

The maintainers of this library would like to acknowledge the JAX and Oryx maintainers for useful discussions and reference code for interpreter-based transformation patterns.

Disclaimer

This is a research project. Expect bugs and sharp edges. Please help by trying out GenJAX, reporting bugs, and letting us know what you think!

Get Involved + Get Support

Pull requests and bug reports are always welcome! Check out our Contributor's Guide for information on how to get started contributing to GenJAX.

The TL;DR is:

  • send us a pull request,
  • iterate on the feedback + discussion, and
  • get a +1 from a maintainer

in order to get your PR accepted.

Issues should be reported on the GitHub issue tracker.

If you want to discuss an idea for a new feature or ask us a question, discussion occurs primarily in the body of GitHub Issues.

Created and maintained by the MIT Probabilistic Computing Project. All code is licensed under the Apache 2.0 License.