MS3.2-24B-Penumbra-Aether

Overview

MS3.2-24B-Penumbra-Aether was created by merging Cydonia-24B-v4.3, Hearthfire-24B, and ms3.2-24b-longform into the base model MS3.2-24B-Chaos-Skies using a custom merge method.

Merge configuration
base_model: Vortex5/MS3.2-24B-Chaos-Skies
models:
  - model: TheDrummer/Cydonia-24B-v4.3
  - model: LatitudeGames/Hearthfire-24B
  - model: Burnt-Toast/ms3.2-24b-longform
merge_method: hpq
parameters:
  strength: 0.78
  flavor: 0.48
  paradox: 0.45
  cube_dims: 20
  steps: 10
  boost: 0.50
dtype: bfloat16
tokenizer:
  source: Vortex5/MS3.2-24B-Chaos-Skies
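The configuration follows mergekit's YAML schema. Below is a minimal sketch of how such a merge could be run through mergekit's Python API; it assumes the custom hpq method and its parameters are registered in the local mergekit installation, and the config and output paths (penumbra-aether.yml, ./MS3.2-24B-Penumbra-Aether) are illustrative.

import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

CONFIG_YML = "penumbra-aether.yml"            # the YAML shown above (illustrative path)
OUTPUT_PATH = "./MS3.2-24B-Penumbra-Aether"   # where the merged weights are written

# Parse the merge configuration; this only validates if the custom "hpq"
# method and its parameters are registered in the local mergekit install.
with open(CONFIG_YML, "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path=OUTPUT_PATH,
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # run tensor ops on GPU when available
        copy_tokenizer=True,             # copy the tokenizer from the configured source
        lazy_unpickle=True,              # stream shards to keep memory usage down
        low_cpu_memory=True,
    ),
)

The same configuration file can also be passed to the mergekit-yaml command-line entry point.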

Intended Use

📜 Storytelling
🎭 Roleplay
🌌 Creative Writing
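For local inference, a minimal sketch using the Transformers library is shown below; the chat prompt and sampling settings are placeholders, and only the model id is taken from this repository.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Vortex5/MS3.2-24B-Penumbra-Aether"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype
    device_map="auto",
)

# Placeholder storytelling prompt; adjust the system and user content to taste.
messages = [
    {"role": "system", "content": "You are a vivid, detail-oriented storyteller."},
    {"role": "user", "content": "Open a scene aboard a drifting starship."},
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=512, temperature=0.8, do_sample=True)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))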