
Midi-Model

Midi event transformer for music generation
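
To make the description concrete, here is a minimal, self-contained sketch (not the code in this repository) of what a MIDI event transformer does: MIDI events are represented as discrete tokens, a causal transformer predicts the next event token, and new music is produced by sampling autoregressively. The class name, vocabulary size, and hyperparameters below are illustrative assumptions.

# Minimal sketch of a MIDI event transformer; all names and sizes are assumptions.
import torch
import torch.nn as nn

VOCAB_SIZE = 1024   # assumed size of the MIDI event vocabulary
MAX_LEN = 512       # assumed maximum event-sequence length

class ToyMidiEventTransformer(nn.Module):
    def __init__(self, d_model=256, n_heads=4, n_layers=4):
        super().__init__()
        self.tok_emb = nn.Embedding(VOCAB_SIZE, d_model)
        self.pos_emb = nn.Embedding(MAX_LEN, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, VOCAB_SIZE)

    def forward(self, tokens):
        # tokens: (batch, seq) integer MIDI event tokens
        seq_len = tokens.size(1)
        pos = torch.arange(seq_len, device=tokens.device)
        x = self.tok_emb(tokens) + self.pos_emb(pos)
        # causal mask: each position attends only to earlier events
        mask = torch.triu(torch.full((seq_len, seq_len), float("-inf"),
                                     device=tokens.device), diagonal=1)
        return self.head(self.encoder(x, mask=mask))  # (batch, seq, vocab)

@torch.no_grad()
def sample_events(model, prompt, steps=64, temperature=1.0):
    # Autoregressively extend a prompt of MIDI event tokens.
    tokens = prompt.clone()
    for _ in range(steps):
        logits = model(tokens)[:, -1] / temperature
        next_tok = torch.multinomial(torch.softmax(logits, dim=-1), 1)
        tokens = torch.cat([tokens, next_tok], dim=1)
    return tokens

model = ToyMidiEventTransformer().eval()
prompt = torch.randint(0, VOCAB_SIZE, (1, 8))       # stand-in for tokenized MIDI
print(sample_events(model, prompt, steps=16).shape)  # torch.Size([1, 24])

The sampled token sequence would then be decoded back into MIDI events by the tokenizer; the repository's actual model and tokenizer (MIDITokenizerV2) are more elaborate than this toy.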

Updates

  • v1.3: MIDITokenizerV2 and a new MidiVisualizer.
  • v1.2: Optimized the tokenizer and dataset. The dataset is filtered with MIDITokenizer.check_quality, and training on this higher-quality dataset significantly improves the model's performance (see the sketch after this list).
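
The v1.2 quality-filtering step amounts to keeping only the MIDI files that pass a tokenizer-level check. The sketch below illustrates the idea; since the exact MIDITokenizer.check_quality interface is not documented here, a hypothetical passes_quality_check(path) stand-in is used, and the directory names are placeholders.

# Sketch of dataset filtering; passes_quality_check stands in for MIDITokenizer.check_quality.
from pathlib import Path
import shutil

def passes_quality_check(midi_path: Path) -> bool:
    # Hypothetical stand-in: assumed to return True for files worth keeping
    # (e.g. enough notes, sane track/tempo structure).
    return midi_path.stat().st_size > 1024  # trivial placeholder heuristic

def filter_dataset(src_dir: str, dst_dir: str) -> int:
    # Copy only the MIDI files that pass the quality check.
    dst = Path(dst_dir)
    dst.mkdir(parents=True, exist_ok=True)
    kept = 0
    for midi_path in Path(src_dir).rglob("*.mid"):
        if passes_quality_check(midi_path):
            # Name collisions across subfolders are ignored for brevity.
            shutil.copy2(midi_path, dst / midi_path.name)
            kept += 1
    return kept

print(filter_dataset("raw_midi", "filtered_midi"), "files kept")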

Demo

Pretrained model

huggingface

Dataset

projectlosangeles/Los-Angeles-MIDI-Dataset

Requirements

  • Install PyTorch (PyTorch >= 2.0 recommended)
  • Install FluidSynth >= 2.0.0
  • pip install -r requirements.txt (a quick sanity check for these requirements is sketched after this list)
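
A quick way to confirm the requirements above (a sketch, not part of the repository): check the installed PyTorch version and that a fluidsynth binary, presumably used to synthesize audio from generated MIDI, is reachable on PATH.

# Environment sanity check for the requirements listed above.
import shutil
import torch

# PyTorch >= 2.0 is recommended.
major, minor = (int(x) for x in torch.__version__.split(".")[:2])
assert (major, minor) >= (2, 0), f"PyTorch >= 2.0 recommended, found {torch.__version__}"
# FluidSynth must be installed and on PATH.
assert shutil.which("fluidsynth") is not None, "fluidsynth binary not found on PATH"
print("PyTorch", torch.__version__, "| CUDA available:", torch.cuda.is_available())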

Run app

python app.py

Train

python train.py

Citation

@misc{skytnt2024midimodel,
  author = {SkyTNT},
  title = {Midi Model: Midi event transformer for symbolic music generation},
  year = {2024},
  howpublished = {\url{https://github.com/SkyTNT/midi-model}},
}
