
Commit fe88889 ("README update")
1 parent 02ca045

3 files changed: 6 additions & 6 deletions


.github/workflows/UnitTests.yml
Lines changed: 1 addition & 1 deletion

@@ -50,7 +50,7 @@ jobs:
         ruff check .
     - name: PyTest
       run: |
-        HF_HUB_CACHE=/mnt/disks/github-runner-disk/ HF_HOME=/mnt/disks/github-runner-disk/ python3 -m pytest
+        HF_HUB_CACHE=/mnt/disks/github-runner-disk/ HF_HOME=/mnt/disks/github-runner-disk/ python3 -m pytest -x
   # add_pull_ready:
   #   if: github.ref != 'refs/heads/main'
   #   permissions:

README.md
Lines changed: 1 addition & 1 deletion

@@ -35,13 +35,13 @@ MaxDiffusion supports
 * Stable Diffusion 2 base (training and inference)
 * Stable Diffusion 2.1 (training and inference)
 * Stable Diffusion XL (training and inference).
+* Flux Dev and Schnell (Training and inference).
 * Stable Diffusion Lightning (inference).
 * Hyper-SD XL LoRA loading (inference).
 * Load Multiple LoRA (SDXL inference).
 * ControlNet inference (Stable Diffusion 1.4 & SDXL).
 * Dreambooth training support for Stable Diffusion 1.x,2.x.

-**WARNING: The training code is purely experimental and is under development.**

 # Table of Contents

src/maxdiffusion/input_pipeline/_tfds_data_processing.py
Lines changed: 4 additions & 4 deletions

@@ -137,10 +137,10 @@ def make_tfrecord_iterator(
   # set load_tfrecord_cached to True in config to use pre-processed tfrecord dataset.
   # pedagogical_examples/dataset_tf_cache_to_tfrecord.py to convert tf preprocessed dataset to tfrecord.
   # Datset cache in github runner test doesn't contain all the features since its shared, Use the default tfrecord iterator.
-  if (config.cache_latents_text_encoder_outputs and
-      os.path.isdir(config.dataset_save_location) and
-      hasattr(config, 'load_tfrecord_cached') and
-      config.load_tfrecord_cached):
+  if (config.cache_latents_text_encoder_outputs
+      and os.path.isdir(config.dataset_save_location)
+      and 'load_tfrecord_cached' in config.get_keys()
+      and config.load_tfrecord_cached):
     return make_cached_tfrecord_iterator(config, dataloading_host_index, dataloading_host_count, mesh, global_batch_size)

   feature_description = {
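
The hunk above swaps `hasattr(config, 'load_tfrecord_cached')` for a membership test against `config.get_keys()`, so the opt-in flag is looked up in the config's key set rather than as a Python attribute. A minimal sketch of that guard in isolation, using a hypothetical `Config` stand-in (the real MaxDiffusion config object and its other conditions are not reproduced here):

```python
class Config:
    """Hypothetical stand-in for the MaxDiffusion config object.

    Keys live in a dict, exposed via get_keys(), and are also
    reachable as attributes for convenience.
    """

    def __init__(self, **kwargs):
        self._keys = dict(kwargs)

    def get_keys(self):
        # Returns the mapping of configured keys.
        return self._keys

    def __getattr__(self, name):
        try:
            return self._keys[name]
        except KeyError:
            raise AttributeError(name)


def use_cached_tfrecords(config):
    # Mirrors the changed part of the guard in make_tfrecord_iterator:
    # only use the cached-tfrecord path when the key is present AND truthy.
    return ('load_tfrecord_cached' in config.get_keys()
            and config.load_tfrecord_cached)
```

Because `and` short-circuits, `config.load_tfrecord_cached` is only read once the key is known to exist, so an absent key falls through to the default iterator instead of raising.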
