
Advanced Topics

  • Dealing with Module Arguments
  • The Module lifecycle
  • Lifted Transformations
  • Convert PyTorch Models to Flax
  • Upgrading my Codebase to Optax
  • Upgrading my Codebase to Linen
  • Linen Design Principles
  • FLIPs

Contributing

  • How to Contribute
  • The Flax Philosophy
  • FLIPs

