
RunPod ComfyUI Template

Community cloud instances advertise 800 Mbps, yet I get throttled to 500 kbps. I've been building Docker images for use on cloud providers (Vast, RunPod, TensorDock, etc.), and I've just made them compatible with RunPod serverless, which can generate images on demand. With my experience so far, I cannot recommend it for anything beyond simple experimentation. RunPod is very rough around the edges and definitely not production worthy; on the other hand, it works 100% of the time, but the network throttling is ridiculous. Aside from this, I'm a pretty happy RunPod customer. Has anyone here successfully deployed a ComfyUI workflow serverless? Does anyone have a rough cost estimate for training an SD 1.5 LoRA? Is this just how it is with the current GPU shortage, or are there any alternatives similar to RunPod?

Upload your SD models and such to a RunPod instance (or another server) with one click. Is RunPod still the best choice for both using and training SD 1.5 and SDXL checkpoints and LoRAs? RunPod's prices have increased, and they now hide important details about server quality.

ComfyFlow RunPod Template (ComfyFlow)
Blibla: ComfyUI on RunPod
GitHub: ComfyUI docker images
ComfyUILauncher/cloud/RUNPOD.md at main
Manage Pod Templates (RunPod Documentation)
ComfyUI Tutorial: How To Install ComfyUI On Windows and RunPod
GitHub: Docker image for runpod, at master

RunPod Works 100% Of The Time, But The Network Throttling Is Ridiculous.

After getting one too many low-quality servers, I'm not using RunPod anymore. The service is very rough around the edges and definitely not production worthy. Or is this just how it is with the current GPU shortage?

Has Anyone Deployed A ComfyUI Workflow Serverless, And Would Be Willing To Share Some Insights?

Has anyone here successfully deployed a ComfyUI workflow serverless, and would be willing to share some insights? Upload your SD models and such to a RunPod instance (or another server) with one click. I've been building Docker images for use on cloud providers (Vast, RunPod, TensorDock, etc.), and I've just made them compatible with RunPod serverless, which can generate images on demand.
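For anyone attempting the serverless route, here's a minimal sketch of what the handler inside such a Docker image might look like. Everything here is an assumption for illustration: `run_workflow` is a hypothetical placeholder for whatever actually drives ComfyUI (typically its HTTP API), and the commented-out `runpod.serverless.start` call marks where the real SDK would take over.

```python
# Hypothetical sketch of a RunPod serverless handler wrapping a ComfyUI
# workflow. run_workflow is a placeholder, not a real ComfyUI client.

def run_workflow(workflow: dict) -> dict:
    # Placeholder: a real image would POST the workflow JSON to a local
    # ComfyUI instance and wait for the resulting images.
    return {"status": "queued", "nodes": len(workflow)}

def handler(event: dict) -> dict:
    """Entry point the serverless runtime calls once per job."""
    workflow = event.get("input", {}).get("workflow")
    if not workflow:
        return {"error": "no workflow supplied"}
    return run_workflow(workflow)

if __name__ == "__main__":
    # Inside the Docker image you would hand the handler to the SDK, e.g.:
    #   import runpod
    #   runpod.serverless.start({"handler": handler})
    print(handler({"input": {"workflow": {"1": {}, "2": {}}}}))
```

The point of the split is that `handler` stays a plain function you can test locally before paying for a single serverless invocation.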

Does Anyone Have A Rough Cost Estimate For Training An SD 1.5 LoRA?

With my experience so far, I cannot recommend it for anything beyond simple experimentation. Are there any alternatives similar to RunPod? Is RunPod still the best choice for both using and training SD 1.5 and SDXL checkpoints and LoRAs? RunPod's prices have increased, and they now hide important details about server quality.
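No official numbers here, but the math is just pod-hours times hourly rate. The figures below are illustrative assumptions only (roughly 1 to 2 hours on a single 24 GB consumer card, at community-cloud rates somewhere in the $0.20 to $0.50/hr range), not quotes from RunPod:

```python
def training_cost(hourly_rate_usd: float, hours: float) -> float:
    """Back-of-the-envelope pod cost: rate * time, ignoring storage/egress."""
    return round(hourly_rate_usd * hours, 2)

# Illustrative numbers only: an SD 1.5 LoRA run (a few thousand steps)
# often fits in 1-2 hours on one 24 GB GPU at assumed community rates.
low = training_cost(0.20, 1.0)
high = training_cost(0.50, 2.0)
print(f"rough range: ${low:.2f} to ${high:.2f}")
```

Under those assumptions a single LoRA run lands around a dollar or less; the real driver is how many experiments you throw away before one sticks.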

Community Cloud Instances Advertise 800 Mbps Yet I Get Throttled To 500 kbps.

Should I host somewhere else instead, maybe on AWS, or stick with RunPod? I wish RunPod did a better job of detecting and explaining this throttled state, but unfortunately they leave it up to the user to discover for now.
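If you suspect you're being throttled, you can measure it from inside the pod rather than trust the advertised figure. A quick sketch using only the standard library (the test URL and helper names are mine, not anything RunPod provides):

```python
import time
import urllib.request

def throughput_mbps(num_bytes: int, seconds: float) -> float:
    """Convert a measured transfer into megabits per second."""
    return (num_bytes * 8) / (seconds * 1_000_000)

def measure(url: str, chunk: int = 1 << 20) -> float:
    """Download url once and return the observed throughput in Mbps."""
    start = time.monotonic()
    total = 0
    with urllib.request.urlopen(url) as resp:
        while True:
            data = resp.read(chunk)
            if not data:
                break
            total += len(data)
    return throughput_mbps(total, time.monotonic() - start)

# For scale: 62,500 bytes/s is exactly 0.5 Mbps, i.e. the 500 kbps
# ceiling complained about above -- nowhere near an advertised 800 Mbps.
print(throughput_mbps(62_500, 1.0))
```

Point `measure()` at any large, fast-hosted file; if a pod that advertises 800 Mbps reports well under 1 Mbps, it isn't your workflow, it's the pipe.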
