GPUs for Running Stable Diffusion

June 2023 · Updated: July 2023

Note that this list is aimed at cloud GPUs, where renting the more expensive GPUs is
comparatively cheap relative to buying the card outright.
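
As a rough illustration of that point, here is a minimal break-even sketch. The retail prices are assumptions made for the sketch, not figures from this page; the hourly rates are the lowest cloud prices from the table below.

```python
# Rough break-even sketch: hours of rental before buying the card outright
# would have been cheaper. Retail prices are assumptions for illustration;
# hourly rates are the lowest cloud prices from the table below.
gpus = {
    # name: (assumed retail price in $, cloud $ per hour)
    "RTX 4090": (1_600, 0.69),
    "H100 PCIe": (30_000, 1.99),
}

for name, (retail, hourly) in gpus.items():
    print(f"{name}: ~{retail / hourly:,.0f} rental hours to match the purchase price")
```

At these assumed prices, renting a 4090 matches its purchase price after roughly 2,300 hours, while an H100 takes roughly 15,000 hours, which is why the pricier cards look comparatively attractive in the cloud.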

You can run Stable Diffusion on smaller/cheaper GPUs!
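
For example, here is a minimal sketch using the Hugging Face diffusers library (the model ID and settings are illustrative, not a recommendation from this page). Half-precision weights plus attention slicing keep peak VRAM low enough for many consumer cards:

```python
# Minimal sketch: Stable Diffusion with reduced VRAM usage
# (Hugging Face diffusers, fp16 weights + attention slicing).
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # illustrative model ID
    torch_dtype=torch.float16,         # half precision roughly halves VRAM
)
pipe = pipe.to("cuda")
pipe.enable_attention_slicing()        # trades a bit of speed for lower peak VRAM

image = pipe("a photo of an astronaut riding a horse on the moon").images[0]
image.save("astronaut.png")
```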

| GPU | VRAM (GB) | Speed relative to H100 for SD | Speed / $ | Lowest cost per hour | Cost at Runpod | Cost at FluidStack | Cost at Lambda Labs |
|---|---|---|---|---|---|---|---|
| RTX 4090 | 24 | 50% 👌 | 0.72 | $0.69 | ✅ $0.69 | None | None |
| H100 PCIe | 80 | 🏆 100% | 0.50 | $1.99 | None | ✅ $1.99 | ✅ $1.99 |
| RTX 3090 | 24 | 21% | 0.49 | 🪙 $0.44 | ✅ $0.44 | $0.59 | None |
| RTX 3080 | 10 | 21% | 0.43 | $0.50 | None | $0.50 | None |
| 6000 Ada | 48 | 48% | 0.40 | $1.19 | $1.19 | None | None |
| A100 (40 GB) | 40 | 43% | 0.39 | $1.10 | None | $1.20 | $1.10 |
| L40 | 48 | 43% | 0.36 | $1.19 | $1.19 | None | None |
| V100 | 16 | 24% | 0.27 | $0.87 | None | $0.87 | None |
| A6000 | 48 | 19% | 0.24 | $0.79 | $0.79 | $0.80 | $0.80 |
| A40 | 48 | 19% | 0.24 | $0.79 | $0.79 | $1.57 | None |
| A100 (80 GB) | 80 | 43% | 0.24 | $1.79 | $1.79 | $2.91 | None |
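
The Speed / $ column appears to be simply the speed relative to H100 divided by the lowest hourly cost. A quick check, using the rounded values from the table (so the last digit can differ slightly from the column above):

```python
# Speed / $ appears to be: (speed relative to H100) / (lowest $ per hour).
# Inputs are the rounded table values, so results can be off in the last digit.
rows = {
    # name: (speed vs H100, lowest $ per hour)
    "RTX 4090": (0.50, 0.69),
    "H100 PCIe": (1.00, 1.99),
    "RTX 3090": (0.21, 0.44),
    "A100 (80 GB)": (0.43, 1.79),
}

for name, (speed, cost) in rows.items():
    print(f"{name}: speed/$ ≈ {speed / cost:.2f}")
```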
