
Optax 0.0.5

@hbq1 released this on 23 Mar, 18:42

Changelog

Note: this is the first GitHub release of Optax. It includes all changes since the repo was created.

Full Changelog

Implemented enhancements:

  • Implement lookahead optimiser #17
  • Implement support for Yogi optimiser #9
  • Implement rectified Adam #8
  • Implement gradient centralisation #7
  • Implement scaling by AdaBelief #6
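
The new optimisers above are exposed as composable gradient transformations. Below is a minimal usage sketch, assuming the release provides `optax.radam`, `optax.yogi`, and `optax.adabelief` as aliases (names follow the present-day Optax API and have not been verified against 0.0.5):

```python
import jax
import jax.numpy as jnp
import optax

# Toy parameters and a quadratic loss, just to exercise the update rule.
params = {"w": jnp.ones((3,))}
loss = lambda p: jnp.sum(p["w"] ** 2)

# Rectified Adam (assumed alias: optax.radam); optax.yogi and
# optax.adabelief are used in exactly the same way.
optimizer = optax.radam(learning_rate=1e-2)
opt_state = optimizer.init(params)

# One optimisation step: compute gradients, transform them, apply updates.
grads = jax.grad(loss)(params)
updates, opt_state = optimizer.update(grads, opt_state, params)
params = optax.apply_updates(params, updates)
```

The lookahead optimiser (#17) is a wrapper rather than a standalone alias: it takes a fast inner optimizer (such as the one above) plus a synchronisation period, and maintains slow and fast copies of the parameters.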

Closed issues:

  • Multiple optimizers using optax #59
  • Change masked wrapper to use mask_fn instead of mask #57
  • Prevent creating unnecessary momentum variables #52
  • Implement Differentially Private Stochastic Gradient Descent #50
  • RMSProp does not match original Tensorflow impl #49
  • JITted Adam results in NaN when setting decay to integer 0 #46
  • Option to not decay bias with additive_weight_decay #25
  • Support specifying end_value for exponential_decay #21
  • Schedules for Non-Learning Rate Hyper-parameters #20
  • Implement OneCycle Learning Rate Schedule #19
  • adam does not learn? #18
  • Which JAX-based libraries is optax compatible with? #14
  • Manually setting the learning_rate? #4
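
Several of the closed issues concern schedules and parameter masking. The sketch below shows how those pieces might fit together, assuming `optax.exponential_decay` accepts an `end_value` (#21), that `optax.masked` takes a mask pytree (#57), and that the weight-decay transformation is exposed as `optax.add_decayed_weights` (named `additive_weight_decay` in some earlier versions, per #25); argument names follow the present-day Optax API and may differ in 0.0.5:

```python
import jax.numpy as jnp
import optax

# Exponential decay schedule with a floor on the final value (issue #21).
lr_schedule = optax.exponential_decay(
    init_value=1e-3,
    transition_steps=1000,
    decay_rate=0.9,
    end_value=1e-5,
)

# Apply weight decay to kernels only, leaving biases untouched (issue #25),
# by masking the decay transformation (issue #57).
params = {"kernel": jnp.ones((4, 4)), "bias": jnp.zeros((4,))}
decay_mask = {"kernel": True, "bias": False}

# AdamW-style chain: adaptive scaling, masked decoupled weight decay,
# then scaling by the (negated) learning-rate schedule.
optimizer = optax.chain(
    optax.scale_by_adam(),
    optax.masked(optax.add_decayed_weights(1e-4), decay_mask),
    optax.scale_by_schedule(lambda step: -lr_schedule(step)),
)
opt_state = optimizer.init(params)
```

The same `scale_by_schedule` pattern also addresses scheduling non-learning-rate hyper-parameters (#20), since any scalar entering a transformation can be driven by a schedule function of the step count.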

Merged pull requests:

* This Changelog was automatically generated by github_changelog_generator