Releases: google-deepmind/dm-haiku
Haiku 0.0.3
Changelog:
- Added `hk.experimental.intercept_methods` (see the sketch after this list).
- Added `hk.running_init`.
- Added `hk.experimental.name_scope`.
- Added optional support for state in `custom_creator` and `custom_getter`.
- Added index groups to `BatchNorm`.
- Added interactive notebooks to the documentation, including a basics guide.
- Added support for batch-major unrolls in `static_unroll` and `dynamic_unroll`.
- Added `hk.experimental.abstract_to_dot`.
- Added step markers to the ImageNet example.
- Added `hk.MultiHeadAttention`.
- Added an option to remove the double bias from `VanillaRNN`.
- Added support for `feature_group_count` in `ConvND`.
- Added a logits config to ResNet models.
- Added various control flow primitives (`fori_loop`, `switch`, `while_loop`).
- Added `cross_replica_axis` to `VectorQuantizerEMA`.
- Added `original_shape` to `ParamContext`.
- Added `hk.SeparableDepthwiseConv2D`.
- Added support for the `unroll` kwarg to `hk.scan`.
- Added an `output_shape` argument to `ConvTranspose` modules.
- Replaced `frozendict` with `FlatMapping`, significantly reducing the overhead of calling jitted computations.
- Misc changes to ensure parameter dtype follows input dtype.
- Multiple changes to support JAX omnistaging.
- `ExponentialMovingAverage.initialize` now takes shape/dtype, not value.
- Replaced optix with optax in examples.
- `hk.Embed` embeddings are now created lazily.
- Re-indexed documentation for easier navigation.
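A minimal sketch of method interception, assuming the interceptor signature `(next_fun, args, kwargs, context)` with a `context` exposing `module` and `method_name`; the model, shapes, and names below are illustrative only:

```python
import jax
import jax.numpy as jnp
import haiku as hk


def logging_interceptor(next_fun, args, kwargs, context):
  # Log which module method is about to run, then call through unchanged.
  print(f"calling {type(context.module).__name__}.{context.method_name}")
  return next_fun(*args, **kwargs)


def forward(x):
  with hk.experimental.intercept_methods(logging_interceptor):
    return hk.nets.MLP([32, 10])(x)


forward = hk.transform(forward)
x = jnp.ones([8, 128])
params = forward.init(jax.random.PRNGKey(42), x)
logits = forward.apply(params, None, x)
```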
Haiku 0.0.2
Changelog:
- Changed the default value of `apply_rng` to `True` in `hk.transform` to simplify the `apply_fn` signature (see the sketch after this list).
- Made `ConvND`, `ConvNDTranspose`, `ResetCore` and pooling modules optionally batched.
- Added `hk.GroupNorm`.
- Added `hk.scan`.
- Changed `hk.BatchNorm` to always create state for moving averages.
- Changed `use_projection` in `hk.nets.ResNet` to take a sequence of bools.
- Exposed `hk.nets.ResNet.{BlockGroup, BlockV1, BlockV2}`.
- Added `original_dtype` to `ParamContext` to expose the original parameter dtype to custom getters.
- Added a GAN example notebook.
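A minimal sketch of the resulting `apply_fn` signature: with `apply_rng=True` the apply function always takes `(params, rng, ...)`, and `hk.without_apply_rng` drops the rng argument for deterministic models. The network and shapes here are illustrative.

```python
import jax
import jax.numpy as jnp
import haiku as hk


def forward(x):
  return hk.Linear(10)(x)


net = hk.transform(forward)  # apply_rng=True is now the default.
x = jnp.ones([4, 8])
params = net.init(jax.random.PRNGKey(0), x)
y = net.apply(params, None, x)  # apply(params, rng, ...); rng may be None here.

# For a purely deterministic apply function, drop the rng argument entirely.
det_net = hk.without_apply_rng(hk.transform(forward))
y = det_net.apply(params, x)
```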
Haiku 0.0.1
Haiku is a simple neural network library for JAX developed by some of the
authors of Sonnet, a neural network library for TensorFlow.
Changelog:
Features:
- Exposed `hk.nets.ResNet` and added `hk.nets.ResNet{18,34,101,152,200}`.
- Added `IdentityCore`.
- Added `custom_getter` API for advanced parameter manipulation.
- Added `ConvND` and lifted the `N <= 3` restriction.
- Added `tree_size` and `tree_bytes` to easily compute parameter counts (see the sketch after this list).
- `hk.remat` now only threads changed values (faster compilation).
- Added support for `@dataclass` to define modules.
- Added support for splitting more than one key at a time: `k1, k2 = hk.next_rng_keys(2)`.
- Experimental: Added `profiler_name_scopes` API to add Haiku names to XProf.
- Experimental: Added `optimize_rng_use` to improve compilation time for models with lots of RNG keys.
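A minimal sketch of counting parameters with these helpers, assuming they are exposed under `hk.data_structures`; the model and shapes are illustrative.

```python
import jax
import jax.numpy as jnp
import haiku as hk


def forward(x):
  return hk.nets.MLP([300, 100, 10])(x)


forward = hk.transform(forward)
params = forward.init(jax.random.PRNGKey(0), jnp.ones([1, 28 * 28]))

# Total parameter count and the bytes used to store the parameters.
num_params = hk.data_structures.tree_size(params)
num_bytes = hk.data_structures.tree_bytes(params)
print(f"{num_params} parameters, {num_bytes / 1e6:.2f} MB")
```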
Examples:
- Added language model example.
- Added VQVAE example.
Bug fixes:
- `LayerNorm` now correctly handles bf16 inputs.
- `TruncatedNormal` initializer now respects dtype.
Usability:
- Improved error messages for `get_parameter`, `to_module` and others.
- Reimplemented core modules with the "public" API (easier to read and fork).
- Added tests that ensure all public symbols are included in documentation.
- Added type annotations to more internal code.
Haiku Beta Release
Changes
Examples
- Added VAE example.
- Added pruning example (https://arxiv.org/abs/1710.01878).
- MNIST example now uses a 300-100-10 MLP.
- Updated imagenet dataset to return correctly scaled examples.
Breaking changes
- State arg to `hk.transform` dropped in favor of `transform_with_state` (see the sketch after this list).
- Decay argument is now required in `BatchNorm`.
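A minimal sketch of the updated API, assuming the released signatures of `hk.transform_with_state` and `hk.BatchNorm` (where the decay is passed as `decay_rate`); the network itself is illustrative.

```python
import jax
import jax.numpy as jnp
import haiku as hk


def forward(x, is_training):
  x = hk.Linear(32)(x)
  # The moving-average decay (decay_rate) must now be given explicitly.
  norm = hk.BatchNorm(create_scale=True, create_offset=True, decay_rate=0.99)
  x = jax.nn.relu(norm(x, is_training))
  return hk.Linear(10)(x)


# Stateful modules (moving averages) require transform_with_state.
forward = hk.transform_with_state(forward)
x = jnp.ones([8, 16])
params, state = forward.init(jax.random.PRNGKey(0), x, is_training=True)
y, state = forward.apply(params, state, None, x, is_training=True)
```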
Features
- Added `hk.maybe_next_rng_key()`.
- BatchNorm and LayerNorm speed improvements.
- Added support for partition/filter/merge of params (see the sketch after this list).
- Haiku now allows running with `jax_numpy_rank_promotion`.
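A minimal sketch of partitioning parameters, assuming `hk.data_structures.partition` and `merge` helpers whose predicate receives `(module_name, name, value)`; the model and the weight-vs-bias split are illustrative.

```python
import jax
import jax.numpy as jnp
import haiku as hk

forward = hk.transform(lambda x: hk.nets.MLP([32, 10])(x))
params = forward.init(jax.random.PRNGKey(0), jnp.ones([1, 8]))

# Split the params into weights and everything else, e.g. to apply weight
# decay only to the weights, then recombine them.
weights, rest = hk.data_structures.partition(
    lambda module_name, name, value: name == "w", params)
params = hk.data_structures.merge(weights, rest)
```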
Experimental features
- `hk.experimental.to_dot`: experimental visualisation support (see the sketch after this list).
- `hk.experimental.lift`: experimental purification support.
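A minimal sketch of generating a graph with `to_dot`, assuming it is called as `hk.experimental.to_dot(fn)(*args)` and returns a Graphviz dot string; the model is illustrative.

```python
import jax
import jax.numpy as jnp
import haiku as hk


def forward(x):
  return hk.nets.MLP([32, 10])(x)


forward = hk.transform(forward)
x = jnp.ones([1, 8])
params = forward.init(jax.random.PRNGKey(0), x)

# Trace apply and emit a Graphviz "dot" description of the modules and ops.
dot = hk.experimental.to_dot(forward.apply)(params, None, x)
print(dot.splitlines()[0])
```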
Usability
- Improved error message when the RNG arg is not an RNG.
- Improved documentation.
- Improved test coverage.
Haiku Alpha Release
Haiku is a neural network library for JAX.