
Always Getting "Out of Memory" Errors (or "Unable to allocate")

Your GPU's VRAM is full. Use a smaller model, such as medium or small, instead of large-v3. The large-v3 model requires at least 8GB of VRAM, and even an 8GB graphics card does not guarantee smooth operation, because other software also consumes VRAM and longer videos demand more. When you encounter this error, try the following:

  1. Use a smaller model, such as small/medium/base.
  2. If you still want to use a large model, choose "Pre-Segmentation" or "Equal Segmentation."
  3. Modify the settings in the Menu Bar -- Tools/Options -- Advanced Options:

CUDA data type: change float32 to int8. If int8 causes an error, use float16 instead.
beam_size: change 5 to 1.
best_of: change 5 to 1.
Context: change true to false.
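
If the application's speech recognition is backed by faster-whisper (a common backend for running Whisper models), the options above roughly correspond to the parameters in the sketch below. This is only an illustration: the model name, device, and audio file name are placeholders, and the exact wiring inside the application may differ.

```python
from faster_whisper import WhisperModel

# Smaller model plus a lighter compute type to reduce VRAM usage.
# If int8 raises an error on your GPU, try compute_type="float16" instead.
model = WhisperModel("small", device="cuda", compute_type="int8")

segments, info = model.transcribe(
    "audio.wav",                       # placeholder input file
    beam_size=1,                       # beam_size: 5 -> 1
    best_of=1,                         # best_of: 5 -> 1
    condition_on_previous_text=False,  # "Context": true -> false
)

for segment in segments:
    print(f"[{segment.start:.2f} -> {segment.end:.2f}] {segment.text}")
```

Lowering beam_size and best_of to 1 reduces the number of decoding candidates held in memory at once, and disabling the previous-text context shortens the prompt passed to each segment; both trade a little accuracy for a smaller memory footprint.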