Commit 0fd67b2

Commit message: updated news in README.md
Parent: 8f86489

2 files changed: 10 additions & 2 deletions

README.md

Lines changed: 8 additions & 0 deletions
@@ -42,6 +42,14 @@ See our guide on running MaxText in decoupled mode, without any GCP dependencies
  ## 🔥 Latest news 🔥

  * \[December 10, 2025\] DeepSeek V3.1 is now supported. Use the existing configs for [DeepSeek V3 671B](https://github.com/AI-Hypercomputer/maxtext/blob/main/src/MaxText/configs/models/deepseek3-671b.yml) and load a V3.1 checkpoint to use the model.
+ * \[December 9, 2025\] [New RL and SFT Notebook tutorials](https://github.com/AI-Hypercomputer/maxtext/tree/main/src/MaxText/examples) are available.
+ * \[December 4, 2025\] The [ReadTheDocs documentation site](https://maxtext.readthedocs.io/en/latest/index.html) has been reorganized.
+ * \[December 3, 2025\] Multi-host support for GSPO and GRPO is now available via [new RL tutorials](https://maxtext.readthedocs.io/en/latest/tutorials/posttraining/rl_on_multi_host.html).
+ * \[November 20, 2025\] A new guide, [What is Post Training in MaxText?](https://maxtext.readthedocs.io/en/latest/tutorials/post_training_index.html), is now available.
+ * \[November 6, 2025\] The Ironwood TPU co-designed AI stack was announced. Read the [blog post on its co-design with MaxText](https://cloud.google.com/blog/products/compute/inside-the-ironwood-tpu-codesigned-ai-stack?e=48754805).
+ * \[October 29, 2025\] The [optimized models tiering documentation](https://maxtext.readthedocs.io/en/latest/reference/models/tiering.html) has been refreshed.
+ * \[October 12, 2025\] Versioning has been added. Check out our [first set of release notes](https://maxtext.readthedocs.io/en/latest/release_notes.html)!
+ * \[October 10, 2025\] Post-training (SFT, RL) via [Tunix](https://github.com/google/tunix) is now available.
  * \[September 26, 2025\] Vocabulary tiling ([PR](https://github.com/AI-Hypercomputer/maxtext/pull/2242)) is now supported in MaxText! Adjust the config `num_vocab_tiling` to unlock more efficient memory usage.
  * \[September 24, 2025\] The GPT-OSS family of models (20B, 120B) is now supported.
  * \[September 15, 2025\] MaxText is now available as a [PyPI package](https://pypi.org/project/maxtext). Users can now [install maxtext through pip](https://maxtext.readthedocs.io/en/latest/guides/install_maxtext.html).
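A hedged aside on the vocabulary-tiling item in the list above: per that entry, enabling it is a single config override. The value below is purely illustrative, not a recommended setting, and the surrounding keys are omitted:

```yaml
# Hypothetical MaxText config override (value is illustrative only):
# tile the vocabulary/logits computation to reduce peak memory usage.
num_vocab_tiling: 8
```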

docs/release_notes.md

Lines changed: 2 additions & 2 deletions
@@ -18,14 +18,14 @@
  ## PyPI Package

- MaxText is [available in PyPI](https://pypi.org/project/maxtext/) and can be installed through pip. Please see our [MaxText Installation Guide](https://maxtext.readthedocs.io/en/latest/guides/install_maxtext.html) for setup instructions.
+ MaxText is [available in PyPI](https://pypi.org/project/maxtext/) and can be installed through pip. Please see our [MaxText Installation Guide](https://maxtext.readthedocs.io/en/latest/install_maxtext.html) for setup instructions.

  ## Releases

  ### v0.1.0

  Our first MaxText PyPI package is here! MaxText is a high performance, highly scalable, open-source LLM library and reference implementation written in pure Python/JAX and targeting Google Cloud TPUs and GPUs for training. We are excited to make it easier than ever to get started.

- Users can now install MaxText through pip, both for local development and through stable PyPI builds. Please see our [MaxText Installation Guide](https://maxtext.readthedocs.io/en/latest/guides/install_maxtext.html) for more setup details.
+ Users can now install MaxText through pip, both for local development and through stable PyPI builds. Please see our [MaxText Installation Guide](https://maxtext.readthedocs.io/en/latest/install_maxtext.html) for more setup details.

  Going forward, this page will document notable changes as we release new versions of MaxText.
