
ffhq train #9

Open
Passenger12138 opened this issue Apr 24, 2023 · 1 comment

@Passenger12138

I would like to know what --use_fp16 setting you used when training on FFHQ. If you could share your full FFHQ training settings in more detail than the paper, along the lines of the command below, I would be grateful. I also wonder whether this setup matches your training:

python scripts/image_train.py --data_dir ./data --attention_resolutions 16 --class_cond False --diffusion_steps 1000 --dropout 0.0 --image_size 256 --learn_sigma True --noise_schedule linear --num_channels 128 --num_head_channels 64 --num_res_blocks 1 --resblock_updown True --use_fp16 False --use_scale_shift_norm True --lr 2e-5 --batch_size 8 --rescale_learned_sigmas True --p2_gamma 0.5 --p2_k 1 --log_dir logs 
@jychoi118
Owner

Unfortunately, we did not use fp16 for training. Our full setting is noted in the README, which is:
python scripts/image_train.py --data_dir data/DATASET_NAME --attention_resolutions 16 --class_cond False --diffusion_steps 1000 --dropout 0.0 --image_size 256 --learn_sigma True --noise_schedule linear --num_channels 128 --num_head_channels 64 --num_res_blocks 1 --resblock_updown True --use_fp16 False --use_scale_shift_norm True --lr 2e-5 --batch_size 8 --rescale_learned_sigmas True --p2_gamma 1 --p2_k 1 --log_dir logs
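For reference, here is a minimal sketch of how the --p2_gamma and --p2_k flags typically enter the training loss in P2 weighting, i.e. a per-timestep weight of 1 / (k + SNR(t))^gamma as described in the paper. This is not the repository's code; the helper name p2_loss_weight and the linear beta schedule values are illustrative assumptions.

import numpy as np

def p2_loss_weight(alphas_cumprod, p2_gamma=1.0, p2_k=1.0):
    # SNR(t) = alpha_bar_t / (1 - alpha_bar_t); P2 weight = 1 / (k + SNR(t))^gamma
    snr = alphas_cumprod / (1.0 - alphas_cumprod)
    return 1.0 / (p2_k + snr) ** p2_gamma

# Example: linear beta schedule with 1000 steps, matching --diffusion_steps 1000
betas = np.linspace(1e-4, 0.02, 1000)
alphas_cumprod = np.cumprod(1.0 - betas)
weights = p2_loss_weight(alphas_cumprod, p2_gamma=1.0, p2_k=1.0)

With --p2_gamma 1 and --p2_k 1 (the README setting above), this down-weights low-noise timesteps where the SNR is large, which is the intent of P2 weighting.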
