Google Cloud demonstrated an LLM training job over 50,000+ TPU v5e chips, in JAX! #18478
hawkinsp
asked this question in
Show and tell
https://cloud.google.com/blog/products/compute/the-worlds-largest-distributed-llm-training-job-on-tpu-v5e
This was done using JAX and MaxText (https://github.com/google/maxtext), amongst other libraries.
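The blog post's run relies on JAX's SPMD model: you describe a device mesh and how arrays are sharded over it, and XLA compiles one program with the cross-device collectives inserted automatically. As a hedged, minimal sketch (not MaxText's actual code; the mesh, axis name, and toy loss here are illustrative only), the core pattern looks like:

```python
import jax
import jax.numpy as jnp
import numpy as np
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

# Build a 1-D mesh over whatever devices are visible. On one host this
# might be 8 TPU cores; the blog post's run spans many pods of v5e chips.
devices = np.array(jax.devices())
mesh = Mesh(devices, axis_names=("data",))

# Shard a toy batch along the "data" axis: each device holds one slice.
batch = jnp.arange(16.0).reshape(8, 2)
sharded = jax.device_put(batch, NamedSharding(mesh, P("data", None)))

# jit compiles a single SPMD program; the global mean implies an
# all-reduce across devices, which XLA inserts for us.
@jax.jit
def loss(x):
    return jnp.mean(x ** 2)

print(loss(sharded))  # mean of 0^2..15^2 = 77.5
```

MaxText builds on this same mechanism, layering model code and multi-slice/multi-pod configuration on top, so scaling from one host to tens of thousands of chips is largely a change of mesh shape rather than of model code.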