mmdit_2b/config.json
{
"depth": 24,
"mlp_ratio": 4,
"vae_latent_dim": 16,
"layer_norm_eps": 1e-06,
"max_latent_resolution": 96,
"patch_size": 2,
"pooled_text_embed_dim": 2048,
"token_level_text_embed_dim": 4096,
"frequency_embed_dim": 256,
"max_period": 10000
}
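
For reference, below is a minimal Python sketch of how these fields might be read into a typed config object. The MMDiTConfig dataclass, the load_config helper, and the field interpretations in the comments (e.g. that max_latent_resolution is measured in latent pixels and divided into patch_size-sized patches) are illustrative assumptions, not part of this repository.

import json
from dataclasses import dataclass

@dataclass
class MMDiTConfig:
    # Field names mirror the keys in config.json above; the comments give
    # one plausible interpretation of each field.
    depth: int                       # number of transformer blocks
    mlp_ratio: int                   # MLP hidden size as a multiple of the model width
    vae_latent_dim: int              # channel count of the VAE latent
    layer_norm_eps: float            # epsilon used in layer normalization
    max_latent_resolution: int       # maximum latent height/width supported
    patch_size: int                  # side length of each square latent patch
    pooled_text_embed_dim: int       # size of the pooled text embedding
    token_level_text_embed_dim: int  # size of per-token text embeddings
    frequency_embed_dim: int         # size of the sinusoidal timestep embedding
    max_period: int                  # max period of the sinusoidal timestep embedding

def load_config(path: str) -> MMDiTConfig:
    with open(path) as f:
        return MMDiTConfig(**json.load(f))

if __name__ == "__main__":
    cfg = load_config("mmdit_2b/config.json")
    # Under the assumed interpretation, a max-resolution latent yields
    # (96 // 2) ** 2 = 2304 image patch tokens.
    tokens_per_side = cfg.max_latent_resolution // cfg.patch_size
    print(tokens_per_side ** 2)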