
GPU Utils ⚡️


FLUIDSTACK VS LAMBDA LABS VS RUNPOD VS TENSORDOCK

July 2023
Table of Contents
 * Which GPU cloud should you use?
 * Runpod vs Lambda Labs vs FluidStack vs Tensordock
   * Runpod
   * FluidStack
   * Lambda Labs
   * Tensordock
 * Even more GPU cloud options


WHICH GPU CLOUD SHOULD YOU USE? #

 * Large quantities of H100s or A100s
   * Talk to Oracle, FluidStack, Lambda Labs. Maybe talk to CoreWeave, Crusoe,
     Runpod, AWS, Azure, GCP. Capacity is low.
 * 1x H100
   * FluidStack or Lambda Labs
 * A few A100s
   * FluidStack or Runpod
 * Cheap 3090s, 4090s, or A6000s
   * Tensordock
 * Stable Diffusion inference only
   * Salad.com
 * For accessing a wide variety of GPUs
   * Runpod or FluidStack
 * If you’re a hobbyist and want an easy start
   * Runpod
 * If you’re tied to an existing large cloud
   * Stick with them, I suppose!
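The decision list above is essentially a lookup table. As a purely illustrative sketch (the keys and structure are my own; only the provider names come from the list):

```python
# Recommendations from the list above, keyed by use case.
# The key names are invented for this sketch; only the providers are from the post.
RECOMMENDATIONS = {
    "many H100s/A100s": ["Oracle", "FluidStack", "Lambda Labs"],
    "1x H100": ["FluidStack", "Lambda Labs"],
    "a few A100s": ["FluidStack", "Runpod"],
    "cheap 3090s/4090s/A6000s": ["Tensordock"],
    "Stable Diffusion inference": ["Salad.com"],
    "wide GPU variety": ["Runpod", "FluidStack"],
    "hobbyist easy start": ["Runpod"],
}

def recommend(need: str) -> list[str]:
    """Return the recommended GPU clouds for a use case, per the list above."""
    return RECOMMENDATIONS.get(need, [])
```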


RUNPOD VS LAMBDA LABS VS FLUIDSTACK VS TENSORDOCK #

Runpod is kind of a jack of all trades. Lots of GPU types. Solid pricing for
most. Easy deployment templates for beginners.

Tensordock is best if you need 3090s, 4090s, or A6000s - their prices are the
best.

Lambda Labs and FluidStack are pretty similar, with similar pricing. Lambda has a
simpler interface, though you’ll get used to FluidStack’s, and FluidStack often
has better availability.


RUNPOD #

 * Pros:
   * Lots of GPU types
   * Good pricing
   * Cool templates
 * Cons:
   * What Runpod provides is a Docker container on a host machine, not a VM
     with your required OS installed. You do still get SSH access to the
     container, with SCP enabled for file transfer.
   * Tensordock pricing is better for 3090s, 4090s, A6000s
   * FluidStack and Lambda have better pricing on H100s

Best for: Beginners, A100s.
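Since the pods expose SSH with SCP enabled, connecting looks roughly like this. The host, port, and key path below are placeholders, not real values; substitute whatever Runpod's connect dialog shows for your pod:

```shell
# Placeholder host/port/key; use the values Runpod displays for your pod.
ssh root@<pod-ip> -p <ssh-port> -i ~/.ssh/id_ed25519

# SCP reuses the same credentials to copy files into the container
# (note: scp uses uppercase -P for the port, unlike ssh's lowercase -p).
scp -P <ssh-port> -i ~/.ssh/id_ed25519 ./model.ckpt root@<pod-ip>:/workspace/
```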


FLUIDSTACK #

 * Pros:
   * Good pricing on H100s
   * Generally the best option for A100 availability (along with Runpod), with
     good pricing
   * Good option for large quantities of H100s
 * Cons:
   * Interface can be confusing at first
   * Prices on ‘preconfigured machines’ are good, but non-preconfigured machines
     are expensive

Best for: A100s, H100s.


LAMBDA LABS #

 * Pros:
   * Nice interface
   * Good pricing on H100s
   * Good option for large quantities of H100s
 * Cons:
   * Poor availability
   * Had driver issues with their H100 instances

Best for: H100s.


TENSORDOCK #

 * Pros:
   * Marketplace pricing is great
   * Cheapest options for 3090s, 4090s, A6000s
 * Cons:
   * Non-marketplace pricing isn’t great
   * Minimal availability on A100s and no H100s

Best for: 3090s, 4090s, A6000s.


EVEN MORE GPU CLOUD OPTIONS #

In general, the ones above will be best for most people. But for more info: see
here, here, here, here and here.
