# Community examples

In addition to the curated list of official Flax examples on GitHub, there is a growing community of people using Flax to build new types of machine learning models. We are happy to showcase any example built by the community here!

If you want to submit your own Flax example, you can start by forking one of the official Flax examples on GitHub.

## Models

| Link | Author | Task type | Reference |
|------|--------|-----------|-----------|
| matthias-wright/flaxmodels | @matthias-wright | Various | GPT-2, ResNet, StyleGAN-2, VGG, … |
| DarshanDeshpande/jax-models | @DarshanDeshpande | Various | Segformer, Swin Transformer, …, plus some stand-alone layers |
| google/vision_transformer | @andsteing | Image classification, image/text | https://arxiv.org/abs/2010.11929, https://arxiv.org/abs/2105.01601, https://arxiv.org/abs/2111.07991, … |
| jax-resnet | @n2cholas | Various ResNet implementations | torch.hub |
| Wav2Vec2 finetuning | @vasudevgupta7 | Automatic Speech Recognition | https://arxiv.org/abs/2006.11477 |

## Tutorials

| Link | Author | Task type | Reference |
|------|--------|-----------|-----------|

## Contributing policy

If you are interested in adding a project to the Community Examples section, take the following into consideration:

  • Code examples: Examples must include a README that is helpful, clear, and explains how to run the code. The code itself should be easy to follow.

  • Tutorials: These docs should preferably be in Jupyter Notebook format (refer to Contributing to learn how to convert a Jupyter Notebook into a Markdown file with jupytext). Your tutorial should be well-written and discuss/describe an interesting topic/task. To avoid duplication, the content of these docs must differ from existing docs on the Flax documentation site and from the other community examples mentioned in this document.

  • Models: repositories with models ported to Flax must provide at least one of the following:

    • Metrics that are comparable to the original work when the model is trained to completion. Having available plots of the metric’s history during training is highly encouraged.

    • Tests to verify numerical equivalence against a well known implementation (same inputs + weights = same outputs) preferably using pretrained weights.

In all cases mentioned above, the code must work with the latest stable versions of the following packages: jax, flax, and optax, and it must make substantial use of Flax. Note that both jax and optax are required dependencies of flax (refer to the installation instructions for more details).
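The numerical-equivalence option for ported models ("same inputs + weights = same outputs") can be sketched as follows. This is a minimal illustration using NumPy stand-ins: `reference_layernorm` plays the role of the original implementation, and `ported_layernorm` plays the role of the Flax port (in a real test, the latter would call `model.apply` with the converted pretrained weights). All function names here are hypothetical, not from any listed repository.

```python
import numpy as np

def reference_layernorm(x, scale, bias, eps=1e-6):
    # Stand-in for the original implementation (e.g. the PyTorch model).
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mean) / np.sqrt(var + eps) * scale + bias

def ported_layernorm(x, scale, bias, eps=1e-6):
    # Stand-in for the port under test; in a real test this would run the
    # Flax model, e.g. model.apply({'params': converted_params}, x).
    inv = 1.0 / np.sqrt(x.var(axis=-1, keepdims=True) + eps)
    return (x - x.mean(axis=-1, keepdims=True)) * inv * scale + bias

# Fixed seed so both implementations see identical inputs and weights.
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 16)).astype(np.float32)
scale = rng.standard_normal(16).astype(np.float32)
bias = rng.standard_normal(16).astype(np.float32)

# The actual equivalence check: outputs must match within float tolerance.
np.testing.assert_allclose(
    ported_layernorm(x, scale, bias),
    reference_layernorm(x, scale, bias),
    rtol=1e-5, atol=1e-5,
)
```

Running such a check against pretrained weights (rather than random ones, as here) is the preferred form, since it also validates the weight-conversion step.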