
Set up Blender for cloud rendering

Configure Blender for cloud rendering with Cycles and Eevee Next.


Blender on our farm runs Cycles (CPU and GPU/OptiX) and Eevee Next, the two production renderers in Blender 4.x. This page covers project packaging — which is simpler in Blender than in most DCCs because of how Blender handles asset paths — per-renderer notes, multi-camera animation rendering, and Blender-specific troubleshooting.

For high-level positioning of Blender on our farm — pricing examples, GPU vs CPU choice, supported add-ons — see the overview article (the V-Ray article also includes Blender context). For specific Cycles GPU benchmarks on RTX 5090, see the benchmark article.

Supported versions

Blender 4.0, 4.1, 4.2 (LTS), 4.3, and 4.4 are pre-installed on every worker. We support Blender's LTS releases as the recommended choice for production work, since Blender 4.2 LTS receives bug fixes through 2026 and is the stable target for most studios. The farm matches your .blend file's saved version automatically.

A note on Blender's release rhythm: Blender ships a major release every three to four months plus an LTS release annually. We provision new releases within two weeks of their public availability — Blender's release cadence is faster than the proprietary DCCs, so the farm tracks closely.

Packaging your Blender project

A Blender project is the .blend file plus any external assets — textures, sound files, simulation caches, linked library .blend files, and EXR HDRIs for environment lighting. Blender's path resolution is more lenient than other DCCs (it uses both absolute and relative paths and falls back through several search locations), but for cloud rendering you should pack everything into a consistent project folder.

The simplest packaging method is Blender's built-in "Pack Into .blend" feature:

  1. Open your project. File → External Data → Find Missing Files (to verify nothing is already broken locally).
  2. Pack assets into the .blend file. File → External Data → Pack All Into .blend. Blender embeds all textures, sounds, and linked images into the .blend file itself.
  3. Save the project. The .blend file is now self-contained — typically 10x to 100x larger than the original, depending on texture count.
  4. Upload the single .blend file (no archive needed if your .blend is under a few GB; for larger files, wrap in .tar.gz or .7z for faster upload).

A second packaging method that produces a smaller upload is "Make Paths Relative" + manual folder structuring:

  1. File → External Data → Make All Paths Relative. All asset paths become relative to the .blend file's directory.
  2. Verify every asset resolves with File → External Data → Report Missing Files. The report should show no missing files.
  3. Place all referenced assets in subfolders relative to the .blend. Standard structure: project/scene.blend plus project/textures/, project/cache/, project/hdr/, etc.
  4. Archive the whole folder as .tar.gz or .7z and upload.
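The archive step (step 4) can be sketched with Python's standard library; the folder and archive names below are illustrative, not a farm requirement:

```python
import tarfile
from pathlib import Path

def archive_project(project_dir: str, archive_path: str) -> str:
    """Archive a Blender project folder (scene.blend + textures/, cache/,
    hdr/) into a single .tar.gz for upload. Paths inside the archive stay
    relative to the project folder, matching the relative paths saved in
    the .blend."""
    project = Path(project_dir)
    with tarfile.open(archive_path, "w:gz") as tar:
        # arcname keeps the top-level folder name, so the worker unpacks
        # the same structure the .blend's relative paths expect.
        tar.add(project, arcname=project.name)
    return archive_path
```

Because `arcname` preserves the top-level folder, the worker unpacks `project/scene.blend` next to `project/textures/`, exactly where the relative paths in the .blend point.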

The relative-paths method is preferable for very large texture libraries (where Pack Into .blend would create an unwieldy single file) and for projects that reuse the same texture library across multiple scenes.

What to verify before submission

A short pre-flight checklist:

  • Active render engine is set. Properties → Render Properties → Render Engine determines whether the worker uses Cycles, Eevee Next, or Workbench. Make sure it matches your project's intent.
  • Frame range is set. Properties → Output Properties → Frame Start / Frame End. The farm respects this range exactly.
  • Output path uses relative tokens. The default //tmp/####.png is fine; the // prefix means "relative to the .blend file." Avoid absolute output paths like D:\renders\.
  • Output format matches your downstream pipeline. PNG sequence with alpha for compositing, OpenEXR Multilayer for full pass output, FFmpeg video for direct delivery. For animation rendering, image sequences are strongly preferred over direct video output.
  • Color management is set. Properties → Render Properties → Color Management. The Blender 4.x default (AgX view transform, sRGB display) works for most projects; Filmic remains available for projects authored with it. If your project uses ACES or a custom OCIO config, include the OCIO config files in your project folder and verify the path resolves on the worker.
  • Active camera is set. The scene's active camera (Properties → Scene Properties → Camera) determines which camera renders. Make sure it matches what you expect.
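Two conventions from the checklist can be illustrated in plain Python: the `//` prefix resolves against the .blend's directory, and a run of `#` characters becomes the zero-padded frame number. Blender performs the real resolution; this sketch only mirrors the convention:

```python
import re
from pathlib import PurePosixPath

def resolve_blender_path(path: str, blend_dir: str) -> str:
    """Resolve Blender's '//' prefix against the .blend file's directory."""
    if path.startswith("//"):
        return str(PurePosixPath(blend_dir) / path[2:])
    return path  # absolute paths pass through unchanged -- avoid these

def expand_frame_token(path: str, frame: int) -> str:
    """Replace a run of '#' with the zero-padded frame number."""
    return re.sub(r"#+", lambda m: str(frame).zfill(len(m.group())), path)
```

For example, `expand_frame_token(resolve_blender_path("//tmp/####.png", "/job/project"), 42)` yields `/job/project/tmp/0042.png` — the path the worker actually writes.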

Renderer-specific notes

Cycles (CPU)

Cycles CPU runs on our Dual Intel Xeon E5-2699 V4 worker tier (up to 256 GB RAM per node). It is the choice for scenes that exceed GPU VRAM, scenes with very large texture libraries, or projects that need the reproducible, bit-identical output of Cycles CPU.

Configuration notes:

  • License: Blender is free and open source; no licensing concerns on the farm.
  • Sampling: Cycles Render Properties → Sampling → Render Samples. Adaptive Sampling with a Noise Threshold of 0.01 produces clean output for animation; lower thresholds (0.005, 0.002) for higher quality at the cost of render time.
  • Denoising: Cycles supports OpenImageDenoise (OIDN) and OptiX (GPU-only). OIDN runs on CPU and produces good results for stills and animation; configure in Render Properties → Sampling → Denoise.
  • Light tree: Cycles includes light tree sampling for many-light scenes (added in Blender 3.6 and enabled by default). For scenes with hundreds of light sources, verify it is enabled under Render Properties → Sampling → Lights.

Cycles (GPU / OptiX)

Cycles GPU runs on our NVIDIA RTX 5090 worker tier (32 GB VRAM per card). It is significantly faster than Cycles CPU for most scenes (typically 5–15× speedup), particularly for projects with heavy ray tracing.

Configuration notes:

  • Device: Properties → Render Properties → Device → GPU Compute. The OptiX backend (which uses RT cores on NVIDIA GPUs) is enabled by default on our workers.
  • VRAM constraints: 32 GB VRAM per worker is enough for most archviz and animation projects. For projects approaching the VRAM limit, the "Persistent Data" option in Render Properties → Performance reduces per-frame load times but increases peak VRAM use. For projects exceeding 32 GB, switch to Cycles CPU or split the scene into renderable chunks.
  • Optix denoiser: Faster than OIDN for animation. Configure in Render Properties → Sampling → Denoise → Use OptiX. Requires NVIDIA GPU (which our worker tier provides).
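As a rough pre-flight against the 32 GB limit, you can sum uncompressed texture memory (width × height × channels × bytes per channel). Geometry, BVH, and denoiser buffers add on top, so treat the estimate as a lower bound; the 75% headroom factor below is our illustrative assumption, not a farm rule:

```python
def textures_fit_vram(textures, vram_gb=32, headroom=0.75):
    """textures: iterable of (width, height, channels, bytes_per_channel).
    Returns (estimated_gb, fits). This is a lower-bound check only --
    meshes, the BVH, and denoising buffers also consume VRAM, hence the
    headroom factor (an assumption, tune to taste)."""
    total_bytes = sum(w * h * c * b for (w, h, c, b) in textures)
    estimated_gb = total_bytes / (1024 ** 3)
    return estimated_gb, estimated_gb <= vram_gb * headroom
```

Ten 8K RGBA half-float textures come to about 5 GB by this count, comfortably inside 32 GB; a few hundred of them would push a scene toward the Cycles CPU tier instead.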

Eevee Next

Eevee Next runs on our GPU worker tier. It is the choice for motion design, stylized renders, and projects where Eevee's specific real-time-derived look is preferred over Cycles' physical accuracy.

Configuration notes:

  • Sampling and reflections: Eevee Next's Render Properties → Sampling controls the per-pixel sample count; 64–128 samples typically produces clean finals. Light probes (for indirect lighting) must be baked per shot, and reflection/refraction settings verified per shot.
  • Eevee Next vs Eevee Legacy: Blender 4.2+ includes Eevee Next, which is the new architecture going forward. Older projects authored in Blender 3.x using Eevee Legacy may need adjustment when migrated. The farm supports both, but if you save a project in 4.2+ the active engine name is BLENDER_EEVEE_NEXT; older BLENDER_EEVEE projects open in compatibility mode.
  • Volumetrics: Eevee Next's volumetric pipeline differs significantly from the legacy version. Verify volumes render as expected in a local test before submitting an animation.

Cycles GPU vs Eevee Next quick comparison

| Concern | Cycles GPU | Eevee Next |
|---|---|---|
| Render speed (typical scene) | Slower per-frame, physically accurate | Faster per-frame, real-time-derived |
| Photorealism | Higher | Lower (but improving rapidly) |
| Animation flicker | Low (with OptiX denoiser temporal mode) | Can be higher; needs light probe baking |
| VRAM constraint | 32 GB hard limit; fallback to CPU | 32 GB but typically lower usage |
| Best for | Archviz, VFX, photorealistic animation | Motion design, stylized, fast iteration |

Multi-camera angle render

For projects that need to render the same scene from multiple camera angles in a single submission, Blender supports two patterns:

Pattern 1: Multiple scenes with different active cameras

Blender allows multiple scenes per .blend file. Each scene can have its own active camera and render settings:

  1. In your project, create a new scene per camera angle (Scene menu → New Scene → Linked Copy, which shares object data between scenes).
  2. Set each scene's active camera to the relevant camera.
  3. When submitting, the farm renders the scene set as "Active" at submit time. To render multiple scenes, submit each as a separate job.
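Pattern 1 fans out to one job per scene. The fan-out is plain data; the field names in this sketch are illustrative, not our actual submission API:

```python
def jobs_for_scenes(blend_file, scenes, frame_start, frame_end):
    """Build one job description per scene in the .blend (Pattern 1).
    Each scene carries its own active camera, so no camera override
    is needed per job."""
    return [
        {
            "file": blend_file,
            "scene": scene_name,  # worker renders this scene
            "frames": (frame_start, frame_end),
        }
        for scene_name in scenes
    ]
```

Three camera scenes become three independent jobs over the same uploaded .blend, which the farm schedules separately.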

Pattern 2: View Layer per camera (Blender 2.8+)

A more efficient pattern for many camera angles is to use View Layers with different active cameras:

  1. In Properties → View Layer, create a View Layer per camera angle.
  2. In Properties → Output Properties → Output → Use Multi-View, configure stereo or multi-view if applicable.
  3. Configure output paths per View Layer using the {layer} token in the output filename.
  4. Submit; the worker renders all enabled View Layers in one pass.

For typical archviz turntables (8–16 camera angles), Pattern 2 is significantly more efficient because the scene loads once per frame and renders all views from the same loaded scene.
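A minimal sketch of the {layer} expansion in step 3 (the token syntax is the one described here; verify it against your submitter before relying on it):

```python
def output_paths(template: str, view_layers) -> dict:
    """Expand a '{layer}' output template into one path per enabled
    View Layer, so each camera angle writes to its own folder."""
    return {layer: template.replace("{layer}", layer) for layer in view_layers}
```

With a template like `//renders/{layer}/####.png`, each View Layer lands in its own subfolder while the frame token is still expanded per frame.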

Submission flow

Three submission channels work for Blender projects:

  • Web upload + submit via dashboard. Upload the packed .blend (or archived project folder), then submit through the website. This is the most common submission path for Blender users.
  • Client App. Upload + submit + auto-download in one wrapper.
  • Submission plugin. A Blender add-on for one-click submit from inside Blender is available; install from your account dashboard.

For the cross-DCC upload-submit-download flow, see the workflow article.

Troubleshooting Blender-specific failures

For general troubleshooting that applies across DCCs, see the cross-DCC troubleshooting article. Blender-specific cases:

  • Some objects in scene not rendering. Most common cause: render visibility is toggled off. In the outliner, check the object's camera icon ("Disable in Renders"); if the icon is toggled off, the object is skipped at render time. Also check that the object is not on a disabled View Layer.
  • Missing textures despite Pack Into .blend. Verify Pack All Into .blend was run after the latest texture changes. If you reloaded textures after packing, they may have reverted to external references. Re-pack and re-save.
  • Render returns black or wrong color. Check Color Management settings (Properties → Render Properties → Color Management). The most common cause is a View Transform mismatch; make sure the View Transform saved in the .blend is the one you previewed with (AgX is the Blender 4.x default and the safest choice).
  • Eevee Next light probes not baking. Light probes need to be baked locally and the baked data needs to be saved with the .blend. If light probes are configured but not baked, the worker may produce flat or missing indirect lighting. Bake locally before submission.
  • Cycles GPU job runs on CPU instead. Verify Properties → Render Properties → Device is set to GPU Compute, not CPU. Also check that the OptiX backend is selected if your scene requires hardware ray-tracing acceleration.
  • OSL custom shader fails on render. Cycles supports OSL (Open Shading Language), but custom OSL shaders need to be included in your project archive. The OSL shader files (.osl) must be in a location the worker can find — the simplest is to place them in the same folder as the .blend and reference them by relative path.
  • Add-on missing on worker. Common add-ons (Animation Nodes, BlenderKit, Sverchok, FLIP Fluids, etc.) may not be pre-installed. For Animation Nodes and similar procedural add-ons, the workaround is to bake procedural geometry to meshes before submission. For other add-ons, contact support to discuss adding to the worker image.

Cross-references

  • Upload, submit, and download workflow
  • How Blender job costs are calculated
  • SFTP guide and archive formats
  • Cross-DCC troubleshooting
  • Installing the Blender add-on
  • RTX 5090 benchmark article

FAQ

Q: Which Blender versions does the farm support? A: Blender 4.0, 4.1, 4.2 LTS, 4.3, and 4.4 are pre-installed on every worker. We follow Blender's release cadence and provision new versions within two weeks of public availability. Blender 4.2 LTS is the recommended choice for production work because it receives bug fixes through 2026.

Q: Should I render with Cycles or Eevee Next? A: Cycles for photorealistic archviz, VFX, and animation where physically-accurate light transport matters. Eevee Next for motion design, stylized work, and fast iteration. Cycles GPU on our RTX 5090 fleet is typically 5–15× faster than Cycles CPU for the same scene, so GPU is the default recommendation unless your scene exceeds 32 GB VRAM.

Q: My scene uses BlenderKit / Sverchok / Animation Nodes — will it render? A: BlenderKit assets that you've already downloaded and saved into the .blend will render fine (they become standard mesh data once placed). Sverchok and Animation Nodes are procedural — if the procedural geometry has not been baked to a static mesh, the worker may produce unexpected output. Bake procedural geometry to meshes (Sverchok: Set/Bake to Mesh; Animation Nodes: bake to action) before submission.

Q: Do I need to pack textures into the .blend, or can I upload them as separate files? A: Either works. Pack Into .blend is the simplest because the result is a single self-contained file. Relative-paths + folder structure is preferable for very large texture libraries where Pack Into .blend would produce an unwieldy single file. Make sure you run "Find Missing Files" before either approach to confirm nothing is broken locally first.

Q: My render finishes but the colors look different from my workstation viewport. A: Most often this is a Color Management or View Transform mismatch. Open Properties → Render Properties → Color Management on your workstation and confirm the settings you save into the .blend: the View Transform (AgX is the Blender 4.x default), any Look transform (high contrast, low contrast, etc.), and the Exposure value are all embedded in the .blend and apply on the worker.

Q: I'm rendering an animation. Should I output to video file or image sequence? A: Image sequence, almost always. PNG with alpha (for compositing) or OpenEXR Multilayer (for full passes) is the standard. Direct video output (FFmpeg) does not parallelize well across the farm — the worker that gets the job renders all frames sequentially and encodes one video file. Image sequences let the farm distribute frames across multiple workers in parallel, which is significantly faster for any animation over ~100 frames.
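The parallelism argument is simple arithmetic: an image sequence splits into per-worker chunks while a single video file cannot. A sketch of the split, with an illustrative chunk size (the real chunking is a scheduling detail the farm chooses):

```python
def split_frames(start: int, end: int, chunk: int):
    """Split an inclusive frame range into chunks that can render
    in parallel on separate workers."""
    return [(f, min(f + chunk - 1, end)) for f in range(start, end + 1, chunk)]
```

`split_frames(1, 240, 50)` yields five chunks that five workers can render concurrently; the same 240 frames encoded straight to one FFmpeg file stay on a single worker for the whole job.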

Q: My project uses custom OSL shaders. Will they work on the farm? A: Yes, if the .osl shader files are included in your project archive and referenced via relative paths in the shader nodes. The simplest pattern is to place all .osl files in the same folder as your .blend file.

Q: How does GPU rendering compare in cost vs CPU on the farm? A: GPU rendering is typically faster per-frame (5–15× for Cycles), but the per-worker cost is higher because GPU machines are expensive. The total cost for a render is roughly comparable between Cycles CPU and Cycles GPU for most scenes — GPU finishes faster but costs more per minute. For specific cost estimates based on your scene, the pricing article gives a per-frame and per-job comparison.

Q: Can I render Blender simulations (fluid, smoke, hair) on the farm? A: For simulations, the cache files need to be baked locally first, then included in your project upload. The farm renders frames against the baked cache; it does not run live simulation. This is the standard pattern across all DCCs — simulation baking is workstation work, frame rendering is farm work.
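A simple pre-upload check in the same spirit: confirm every baked cache file the scene expects actually exists relative to the project folder. The expected list is something you maintain yourself; this is a sketch, not a farm-provided tool:

```python
from pathlib import Path

def missing_cache_files(project_dir, expected):
    """Return cache paths (relative to the project folder) that are absent.
    Simulation caches must be baked locally and shipped with the upload;
    the farm renders against them and never re-simulates."""
    root = Path(project_dir)
    return [p for p in expected if not (root / p).exists()]
```

An empty return means every listed cache file is in place; anything else should be re-baked (or re-copied) before you archive and upload.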

---


Last updated: May 13, 2026