Always Getting "Out of Memory" Error (Unable to Allocate)
This error means GPU memory is full. The large-v3 model requires at least 8GB of VRAM, and even 8GB does not guarantee smooth performance, since other applications also consume VRAM and longer videos require more memory. If you encounter this error, try the following:
- Use a smaller model, such as small, medium, or base, instead of large-v3.
- If you still want to use a large model, choose the "Pre-segment" or "Equal Split" option.
- Modify settings in the menu bar under Tools/Options > Advanced Options:
  - Change the CUDA data type from float32 to int8; if errors occur, switch to float16.
  - Change beam_size from 5 to 1.
  - Change best_of from 5 to 1.
  - Set context from true to false.
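To see why switching the CUDA data type from float32 to int8 helps so much, here is a rough back-of-the-envelope sketch of the VRAM needed just to hold the model weights. The parameter counts below are the commonly published approximate sizes for the Whisper models; actual usage will be noticeably higher because activations, beam-search state, and framework overhead also consume VRAM.

```python
# Approximate parameter counts (in millions) for Whisper model sizes.
PARAMS_M = {"tiny": 39, "base": 74, "small": 244, "medium": 769, "large-v3": 1550}

# Bytes needed to store one parameter at each compute type.
BYTES_PER_PARAM = {"float32": 4, "float16": 2, "int8": 1}

def weight_vram_mb(model: str, compute_type: str) -> int:
    """Rough estimate (MB) of the memory taken by model weights alone."""
    return PARAMS_M[model] * BYTES_PER_PARAM[compute_type]

for ct in ("float32", "float16", "int8"):
    print(f"large-v3 @ {ct}: ~{weight_vram_mb('large-v3', ct)} MB")
```

By this estimate, large-v3 weights alone take roughly 6.2GB at float32 but only about 1.5GB at int8, which is why int8 can make the large model usable on an 8GB GPU, while a smaller model such as small fits comfortably at any precision.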