Real-Time Ray Tracing in Games: The Evolution from Quake to 2026

By Thierry Marc

In 2019, NVIDIA released something that felt impossible: Quake II RTX, a real-time remaster of the legendary 1997 shooter, rendered with full path tracing on a single GPU. The original Quake II, groundbreaking for its time, used pre-baked lightmaps and basic rasterization. RTX-enabled GPUs could now compute light bounces interactively, transforming the game's appearance from flat polygons to photorealistic lighting.

This single demo marked a watershed moment. Real-time ray tracing, once relegated to offline rendering in films, had entered games. By 2026, ray tracing is standard in AAA titles. The convergence of real-time and offline rendering technology is nearly complete—techniques developed for film are shipping in games, and game optimization methods inform film rendering.

This guide traces that evolution: how we went from Quake II RTX as a technical marvel to full path tracing as a graphical baseline, and what that means for the industry.

The Quake II RTX Watershed

Quake II RTX was a proof of concept, but it was a powerful one. A game more than two decades old, reimagined with modern light transport simulation, felt brand new. Reflections were physically correct. Shadows from multiple light sources integrated seamlessly. Materials behaved realistically: metal reflected the environment, glass transmitted light without the banding of fake refraction.

The demo ran on RTX 2080 Ti hardware at 1080p, 60 frames per second. This was remarkable because it should have been impossible. Path tracing a scene with multiple bounces, producing noise that required denoising, all in milliseconds—this violated assumptions about rendering complexity.

The key innovation: NVIDIA's RT cores. These specialized hardware units accelerated ray-scene intersection, the bottleneck in ray tracing. Combined with NVIDIA's DLSS (Deep Learning Super Sampling), which used AI to reconstruct high-resolution images from lower-resolution rendering, RTX made ray tracing viable in real-time.
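
To make the bottleneck concrete, here is a minimal sketch of the kind of test RT cores accelerate in hardware: a ray against an axis-aligned bounding box, using the classic slab method. This is an illustrative reimplementation, not NVIDIA's actual hardware logic, and all names are invented for the example.

```python
def ray_aabb_hit(origin, direction, box_min, box_max):
    """Return True if the ray hits the axis-aligned box (slab method)."""
    t_near, t_far = float("-inf"), float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if d == 0.0:
            # Ray is parallel to this slab: a hit is only possible if the
            # origin already lies between the two slab planes.
            if o < lo or o > hi:
                return False
            continue
        t1, t2 = (lo - o) / d, (hi - o) / d
        if t1 > t2:
            t1, t2 = t2, t1
        # Intersect this slab's entry/exit interval with the running interval.
        t_near, t_far = max(t_near, t1), min(t_far, t2)
    return t_near <= t_far and t_far >= 0.0

# A ray from the origin along +x hits a unit box centered at (5, 0, 0):
print(ray_aabb_hit((0, 0, 0), (1, 0, 0), (4.5, -0.5, -0.5), (5.5, 0.5, 0.5)))  # True
```

A GPU traverses a bounding volume hierarchy built from millions of such boxes per frame, which is why moving this one test into fixed-function silicon pays off so dramatically.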

Quake II RTX's impact was psychological as much as technical. Gamers saw photorealistic lighting in a real-time game. The conversation shifted from "Can we do ray tracing in real-time?" to "When will everything use ray tracing?"

The RTX Series Evolution

NVIDIA's RTX GPU lineage progressed in three generations:

RTX 2000-Series (2018-2019): The original RTX 2080 Ti had first-generation RT cores and tensor cores (for DLSS). DLSS 1.0 relied on per-game neural network training for upsampling. Results were promising but often soft, with visible artifacts.

RTX 3000-Series (2020-2021): Significantly faster RT cores, more transistors, and DLSS 2.0. DLSS 2.0 shifted from game-specific training to temporal reconstruction: it rendered at lower resolution, fed multiple frames to the AI, and reconstructed high-quality images. Artifacts largely disappeared. DLSS 2.0 was a massive leap.

RTX 4000-Series (2022-2023): Even more RT cores, more memory, and DLSS 3.0 with frame generation. Frame generation doesn't just upscale; it generates entirely new frames between rendered frames, effectively doubling frame rate. A game rendering at 30fps could present at 60fps through frame generation.

RTX 5000-Series (2025-2026): The latest hardware, with further gains in RT core throughput and multiplied AI acceleration. Frame generation is standard.

Each generation made ray tracing faster and more practical. By RTX 4000, ray tracing wasn't aspirational—it was baseline for AAA games.

Key Games and Industry Adoption

Several AAA titles exemplify real-time ray tracing's maturation:

Cyberpunk 2077 (2020): A landmark for ray-traced reflections and lighting in a massive open world. Initial performance was rough (the entire game struggled on launch), but once optimized, it demonstrated that complex ray-traced scenes were viable. Cyberpunk 2077 with ray tracing at high settings showed that interactive photorealism was real.

Alan Wake 2 (2023): Perhaps the most ambitious ray-traced game yet. Full path-traced global illumination (not just reflections), complex volumetrics, and intricate shader networks. Alan Wake 2 runs on RTX 3000 and 4000 hardware at 1080p-1440p with reasonable frame rates through DLSS upscaling and, on RTX 4000, frame generation. The image quality is genuinely photorealistic.

Portal with RTX (2022): A free remaster of Portal, fully ray-traced. What's remarkable is how simple the core gameplay remains—you still place portals and solve puzzles—but the visuals are utterly transformed. Ray-traced reflections in portal surfaces, physically accurate lighting, photorealistic materials. Portal with RTX demonstrates ray tracing's transformative power even in conceptually simple games.

Unreal Engine 5's Lumen: Not a game per se, but a rendering technology shipped in UE5. Lumen is a real-time global illumination system that computes lighting dynamically using ray tracing—software ray tracing against signed distance fields by default, with optional hardware RT core acceleration. Developers can skip pre-baked lightmaps entirely. This is a fundamental shift in how real-time games are rendered.

Understanding the Technology Gap

For those coming from offline rendering, real-time ray tracing's constraints are important:

Sampling and Denoising: A film renderer might cast 1,000 samples per pixel. A real-time renderer casts 1-8 samples per pixel and denoises heavily. The denoiser (often AI-based) reconstructs a high-quality image from sparse samples. This works because temporal information (previous frames) and spatial information (neighboring pixels) constrain the denoiser.
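
The temporal half of that idea can be shown with a toy sketch: blend each noisy low-sample frame into a running per-pixel history, so noise averages out across frames. Real denoisers add motion-vector reprojection and spatial filtering; this illustrative example (all names invented) shows only the exponential moving average at the core.

```python
import random

def accumulate(history, noisy_frame, alpha=0.1):
    """Blend a new noisy frame into the per-pixel accumulated history."""
    return [h * (1 - alpha) + n * alpha for h, n in zip(history, noisy_frame)]

random.seed(0)
history = [0.0]                                   # one-pixel "image"
for _ in range(200):
    noisy = [0.5 + random.uniform(-0.4, 0.4)]     # 1-sample estimate of true value 0.5
    history = accumulate(history, noisy)
print(round(history[0], 2))                       # settles close to 0.5
```

The tradeoff mirrored here is real: a small alpha averages more frames (less noise, more ghosting under motion), a large alpha reacts faster but stays noisier.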

Frame Generation vs. Frame Rendering: Rendering 60 frames per second in ray tracing is expensive. DLSS 3's frame generation renders every other frame (30fps) and AI-generates the in-between frames (presenting 60fps to the player). This is controversial—some argue generated frames introduce latency or artifacts. In practice, modern frame generation is convincing.
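
The interpolation idea behind frame generation can be sketched in a few lines: synthesize an in-between frame from two rendered frames. DLSS 3 uses motion vectors and an optical-flow network rather than this naive per-pixel blend; the example (names invented) only illustrates the concept.

```python
def interpolate_frame(frame_a, frame_b, t=0.5):
    """Blend two frames; t=0.5 approximates the midpoint frame in time."""
    return [a * (1 - t) + b * t for a, b in zip(frame_a, frame_b)]

# Two rendered frames at 30fps; the generated midpoint fills the 60fps gap.
frame_a = [0.0, 0.0]
frame_b = [1.0, 1.0]
print(interpolate_frame(frame_a, frame_b))  # [0.5, 0.5]
```

The latency concern follows directly from this structure: the midpoint frame cannot be generated until frame_b has been rendered, so presentation necessarily trails rendering.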

Path Tracing vs. Hybrid Approaches: Full path tracing (every bounce computed by ray tracing) is rare in games because it's slow. Hybrid approaches are standard: screen-space reflections for nearby surfaces, ray-traced reflections for distant objects, baked ambient occlusion, ray-traced global illumination. These hybrids deliver 80% of the visual quality at 20% of the cost.

Offline vs. Real-Time: Offline renders can spend seconds per frame. Real-time renders need milliseconds. A 60fps game has 16ms per frame for rendering, AI upscaling, frame generation, and everything else. This constraint drives different algorithm choices. Offline film rendering is computationally luxurious by comparison.
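
The 16ms figure above is simple arithmetic: a target frame rate fixes the time budget everything must fit into. A tiny helper (illustrative, not from any real engine) makes the numbers explicit:

```python
def frame_budget_ms(target_fps):
    """Milliseconds available per frame at a given target frame rate."""
    return 1000.0 / target_fps

print(round(frame_budget_ms(60), 2))  # 16.67 ms per frame at 60fps
print(round(frame_budget_ms(30), 2))  # 33.33 ms per frame at 30fps
```

Every subsystem—ray tracing, denoising, upscaling, frame generation, game logic—must share that budget, which is why real-time rendering favors approximate-but-fast algorithms.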

The Convergence of Real-Time and Offline

Here's what's fascinating: techniques developed for games are now used in film rendering, and vice versa.

From Games to Film:

DLSS-like AI upsampling is now used in render farms. If you render at 50% resolution with denoising and AI reconstruction, you can render twice as fast with minimal quality loss. We've seen commercial render farms adopt similar strategies for V-Ray and Arnold.

Frame generation concepts are influencing animation rendering pipelines. Motion estimation and interpolation, core to frame generation, are useful for temporal coherence in animation rendering.

Real-time path tracing optimization techniques inform offline rendering. Fast sampling methods, adaptive sampling strategies, and denoising pipelines developed for games are adapted for offline use.

From Film to Games:

Full path tracing, once exclusive to offline, is now appearing in games (Alan Wake 2). Advanced shader networks and material complexity are adopted from offline rendering.

Asset quality and detail from film production is shipping in games. Scanned materials, high-polygon models, and physically-based shading are game standard now.

Temporal reconstruction techniques from denoising are adapted to frame generation.

In 2026, the boundary between real-time and offline rendering is blurred. They're increasingly the same technology, applied with different performance constraints.

Current State of Full Path Tracing in Games

In 2026, a few games are fully path-traced (every bounce computed by ray tracing):

  • Portal RTX (simple geometry, manageable)
  • Alan Wake 2 (with DLSS 3 frame generation making its path tracing playable)
  • Emerging indie titles exploring pure path tracing

But most AAA games remain hybrid: ray-traced reflections and shadows, but baked global illumination or screen-space approximations. This is sensible—hybrid approaches deliver most of the visual quality at a fraction of the cost.

Full path tracing at 60fps, 1440p+ resolution, with no denoising or frame generation, is still impractical. For that, ray-traced reflections plus other approximations remain the standard.

DLSS and Frame Generation's Role

DLSS deserves its own section because it's critical to real-time ray tracing's success.

DLSS renders at lower resolution (typically 50-67% of native per axis, depending on quality mode), then AI reconstructs the missing pixels. DLSS 2.0 used temporal data; DLSS 3.0 extends this with frame generation.

Performance impact: a game running ray-traced reflections at native 1440p might get 30fps. With DLSS, rendering at ~950p with reconstruction, it achieves 60fps with visually equivalent quality.
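
The ~950p figure follows from the scaling math: DLSS "Quality" mode renders at roughly 67% of each axis, so pixel count drops to about 45% of native. A short sketch (the helper name is invented; the 0.67 factor matches DLSS Quality mode):

```python
def internal_resolution(native_w, native_h, scale=0.67):
    """Internal render resolution and the fraction of native pixels shaded."""
    w, h = round(native_w * scale), round(native_h * scale)
    pixel_fraction = (w * h) / (native_w * native_h)
    return w, h, pixel_fraction

w, h, frac = internal_resolution(2560, 1440)
print(w, h, round(frac, 2))  # 1715 965 0.45
```

Shading under half the pixels is where most of the speedup comes from; the reconstruction network then fills in the rest from temporal history.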

Controversially, some argue DLSS introduces latency (a frame of delay between input and output). In practice, latency is minimal in modern DLSS, but competitive gaming communities debate its fairness.

For casual games and visual quality, DLSS is a win: better visuals, higher frame rates. For competitive esports titles, the latency tradeoff is debated.

DLSS's success spawned competitors: AMD's FSR and Intel's XeSS. FSR began as a non-ML upscaler (later versions added machine learning), while XeSS uses its own neural models; both are more open than DLSS. By 2026, all major GPUs have some form of AI upsampling.

Hardware Requirements and Market Impact

Real-time ray tracing requires recent GPU hardware: RTX 2000-series or newer from NVIDIA, RDNA 2 or newer from AMD, or Arc from Intel enables practical ray tracing.

This has market implications: older GPUs can't play modern ray-traced games at quality levels. The gap between "minimum specs" and "high-quality" specs has widened. A 1080 Ti (released 2017) struggles with modern ray-traced games. RTX 2080 Ti (2018) can handle them. RTX 3080+ is comfortable.

For console gaming, PlayStation 5 and Xbox Series X have ray tracing hardware (both use custom AMD RDNA 2 designs), enabling ray tracing in console games. Game developers now assume hardware ray tracing support across their target platforms.

The Future: Path Tracing and Beyond

In 2026, full path tracing in real-time is emerging but not standard. What's the trajectory?

Moore's Law and Hardware: GPUs continue improving 20-30% annually. In 5 years (2031), real-time path tracing at quality settings might be routine. Full path tracing at 60fps, 4K, with minimal compromises could be viable by then.

AI Reconstruction: AI upsampling is only improving. If denoising and reconstruction continue advancing, you could render path-traced scenes at very low sample counts and reconstruct high-quality images. The "good enough" sample count might drop from 4 to 1, unlocking 4x speedups.

Compression and Streaming: Cloud gaming and game streaming are evolving. If games can be rendered in datacenters and streamed to clients, local hardware constraints vanish. Ray tracing on cloud render farms (similar to offline rendering) could be streamed in real-time. This is technically challenging but conceptually viable.

Offline-Like Rendering in Real-Time: The ultimate convergence: render farm technology applied to real-time games. Distribute rendering across many GPUs, aggregate the results, stream to players. This would enable visuals approaching offline film quality in interactive contexts.

This is speculative, but the direction is clear: more ray tracing, more path tracing, more AI reconstruction, less distinction between real-time and offline.

Implications for Content Creators

For game developers, real-time ray tracing is now a required consideration. AAA games are expected to support ray tracing. This means understanding ray tracing performance, learning DLSS/upsampling tools, and optimizing accordingly.

For 3D artists, this means understanding physically-based shading at a deeper level. Materials that looked fine under rasterization show artifacts under ray tracing. Artists need to calibrate materials for ray-traced contexts.

For studios rendering offline content, real-time ray tracing techniques offer optimization paths. Why render a sequence offline for 24 hours when real-time rendering plus temporal reconstruction achieves similar quality in seconds?

Render Farm Implications

At Super Renders Farm, ray tracing and real-time techniques influence our infrastructure:

We support GPU-heavy rendering pipelines. Clients rendering Unreal Engine 5 with Lumen or ray-traced content need GPU capacity, which we provide at scale.

AI reconstruction and denoising are routine steps in render farm pipelines now. We integrate DLSS-like tools into post-processing.

Some clients are exploring hybrid approaches: render scenes with real-time techniques, output to offline render farms for final polish. The boundaries are genuinely blurring.

FAQ

Q: Is ray tracing actually real-time in 2026, or is it mostly denoising trickery? A: Both. Modern ray-traced games truly ray-trace reflections, shadows, and global illumination. But they're heavily optimized: lower sample counts, denoising, and AI reconstruction are essential. The final image is genuine ray tracing plus intelligent reconstruction. It's not "fake" rendering—it's optimized real-time path tracing. Film rendering, similarly, pairs heavy sampling with denoising. Different scale, same philosophy.

Q: Should I buy a GPU for ray tracing gaming? A: If you're buying a modern GPU for gaming, yes—ray tracing is present and worth enabling. RTX 3060 and higher from NVIDIA, RX 6600 XT and higher from AMD. These enable ray tracing at reasonable settings. For maximum quality, RTX 4070+ or equivalent. Ray tracing visibly improves image quality; it's worth the investment.

Q: Is DLSS 3 frame generation worth the latency? A: Depends on the game. For single-player, story-driven games (Alan Wake 2, Cyberpunk), frame generation is excellent—doubles frame rate with minimal latency. For competitive multiplayer, the input lag is debatable. Professional esports players often disable it. For casual multiplayer, it's fine.

Q: Will path tracing fully replace rasterization? A: Unlikely in the next 5 years. Hybrid approaches (rasterized geometry with ray-traced reflections/shadows) will remain standard because they're faster. Full path tracing is visually superior but expensive. As hardware improves, the balance will shift toward more path tracing, but rasterization as a foundational step will probably persist.

Q: Can I use real-time ray tracing techniques for offline film rendering? A: Yes, and increasingly studios do. Rendering at lower sample counts plus AI reconstruction can accelerate offline rendering. However, offline rendering prioritizes quality over speed, so this trades time for potential artifacts. It's useful for previews and iterative work, less so for finals demanding perfect quality.

Q: What's the difference between hardware ray tracing and software ray tracing? A: Hardware ray tracing uses specialized GPU units (RT cores) for ray-scene intersection, significantly accelerating it. Software ray tracing uses standard GPU compute. Hardware is 10-50x faster. All modern GPUs (RTX, RDNA, Arc) have hardware ray tracing.

Related Resources

For comprehensive information on real-time rendering and GPU technology, our GPU Cloud Render Farm guide covers hardware selection for modern rendering workloads. We also support Unreal Engine 5's Lumen and ray-traced rendering on our infrastructure.

For game developers exploring ray tracing, our Blender Cloud Rendering guide covers ray-traced rendering in a different context, but principles transfer.

The Convergence Accelerating

In 2018, Quake II RTX was a novelty: "Look, we can ray-trace in real-time!" In 2026, ray tracing is an expectation. The novelty has shifted: full path tracing, frame generation, AI reconstruction—these are the new frontier.

What's most interesting is the philosophical shift. For decades, real-time and offline rendering were separate domains. Game engines and film renderers used different algorithms, different hardware, different optimization strategies. That separation is collapsing.

In 2026, advanced real-time game engines use techniques from film rendering. Established film render farms use optimization strategies from games. GPUs handle both contexts. Materials and shaders are shared between real-time and offline.

This convergence will accelerate. The next five years will see more path tracing in games, more real-time techniques in offline rendering, and fewer clear boundaries between the two.

For artists and developers, this means your skills are increasingly transferable. Learn ray tracing, path tracing, DLSS, and denoising in one context, and they're valuable in another. The industry is moving toward a unified rendering philosophy: physically-based, ray-traced, AI-enhanced, and pragmatically optimized for the context.

About Thierry Marc

3D Rendering Expert with over 10 years of experience in the industry. Specialized in Maya, Arnold, and high-end technical workflows for film and advertising.