SDXL VRAM System Requirements – Recommended GPU, CPU, and RAM for Stable Diffusion to Run Locally
Constantly improving AI models require more and more computing power. Running SDXL locally on old hardware can take ages per image, and if your GPU falls below the lower VRAM threshold, it may not work at all.
Here are the minimal and recommended local system requirements for running the SDXL model:
GPU for Stable Diffusion XL – VRAM Minimal Requirements
4GB VRAM – the absolute minimum. The preferred software is ComfyUI, as it’s more lightweight. The base model will run on a 4GB graphics card, but our tests show that it’s pushing the limit.
Recommended graphics card: ASUS GeForce GTX 1050 Ti 4GB
6GB VRAM – SDXL will work better than on a 4GB card, but it’s still not enough for comfortable work. Would you like to wait up to an hour to generate a single 1024×1024 image?
Recommended graphics card: ASUS GeForce RTX 2060 6GB
8GB VRAM – users report that SDXL works fine on an 8GB graphics card. Image generation is fairly fast, but there are still complaints from users who work with Automatic1111. One generation takes about half a minute with the base model plus the refiner.
Recommended graphics card: MSI Gaming GeForce RTX 3060 Ti 8GB
12GB VRAM – this is the recommended VRAM for working with SDXL. Generation is fast and takes about 20 seconds per 1024×1024 image with the refiner. Some users report that they’ve been able to train a 1024×1024 LoRA model on 12GB of VRAM.
Recommended graphics card: MSI Gaming GeForce RTX 3060 12GB
16GB VRAM guarantees comfortable 1024×1024 image generation using the SDXL model with the refiner. It’ll be faster than 12GB VRAM, and if you generate in batches, it’ll be even better.
Recommended graphics card: ASUS GeForce RTX 4080 16GB
24GB VRAM is enough for comfortable model fine-tuning and LoRA training, according to our tests. Training one shouldn’t take more than an hour and a half, and image generation takes only seconds.
Recommended graphics card: ASUS TUF GeForce RTX 4090 24GB
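The tiers above can be summarized in a small helper. This is a hypothetical function of our own; the thresholds and timings are the ones reported in this article, not an official specification:

```python
# Hypothetical helper summarizing the VRAM tiers described above.
# Thresholds and timings come from this article's tests, not from any official spec.

def sdxl_experience(vram_gb: int) -> str:
    """Return the expected SDXL experience for a given amount of VRAM (GB)."""
    if vram_gb < 4:
        return "below the absolute minimum - SDXL may not run at all"
    if vram_gb < 6:
        return "barely usable - base model only, prefer ComfyUI"
    if vram_gb < 8:
        return "works, but slow - up to an hour per 1024x1024 image"
    if vram_gb < 12:
        return "fine for generation - about 30 s per image with the refiner"
    if vram_gb < 16:
        return "recommended - about 20 s per image, LoRA training possible"
    if vram_gb < 24:
        return "comfortable generation, even faster in batches"
    return "comfortable fine-tuning and LoRA training"

print(sdxl_experience(12))  # the recommended tier
```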
CPU for Stable Diffusion XL – Minimal Requirements
There are no specific requirements or recommendations on a CPU for SDXL. However, if you’re picking one for a future setup, make sure it matches the performance of the graphics card so there are no bottlenecks.
How much RAM do you need for Stable Diffusion?
The more RAM, the better, but 32GB is the minimum. During our tests, we found that working with SDXL on less than 32GB of RAM can be challenging and genuinely uncomfortable. If you can, get more than 32GB.
Does Stable Diffusion XL work on AMD graphics cards?
Community members on Reddit have confirmed that Stable Diffusion XL works with AMD graphics cards (RX 6700, RX 6800, etc.) using the Automatic1111 DirectML fork, but it runs very slowly and causes heavy SSD writes.
Using SDXL on Google Colab or services like ClipDrop is much more efficient.
Does Stable Diffusion XL work on Intel graphics cards?
Yes, it can. But you don’t want to experience that kind of torture when Google Colab, Clipdrop, and the Discord bot are available for free.
Does Stable Diffusion XL work on Apple M1 processors?
It is possible, but the most popular software, like Automatic1111 and others, is designed and best suited for a Windows PC with an Nvidia GPU. For example, using the hires.fix upscaler (recommended in most tutorials) on an M1 Mac will take forever, so we don’t recommend it on a Mac.
You can also try running Stable Diffusion using DiffusionBee, software specifically made for M1/M2 chips. If you’re going to use custom models (checkpoints), make sure it’s an FP16 model, not FP32.
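You can verify whether a downloaded checkpoint is FP16 or FP32 without loading it: a `.safetensors` file starts with an 8-byte little-endian header length followed by a JSON header whose tensor entries carry a `dtype` field (per the safetensors format). The helper below is our own sketch based on that format:

```python
import json
import struct

def checkpoint_dtypes(path: str) -> set:
    """Return the set of tensor dtypes ("F16", "F32", ...) in a .safetensors file.

    A safetensors file begins with an 8-byte little-endian length, followed by a
    JSON header mapping tensor names to {"dtype", "shape", "data_offsets"}.
    """
    with open(path, "rb") as f:
        (header_len,) = struct.unpack("<Q", f.read(8))
        header = json.loads(f.read(header_len))
    return {v["dtype"] for k, v in header.items() if k != "__metadata__"}

# Usage: print(checkpoint_dtypes("model.safetensors"))
# If the result contains "F32", the checkpoint is not an FP16 model.
```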
Can you run Stable Diffusion XL without a GPU?
You probably can, but we’d recommend against it, unless you’re using Google Colab or a similar service rather than running it locally.