Uncovering the Nuance Behind Photorealistic Art

TL;DR: Realism in game art isn’t about copying reality but about believability. Jeroen reflects on how the tools, techniques, and workflows behind photorealistic art have evolved, and why adaptability is crucial to keeping digital art authentic.

Realism isn’t about how accurate it is but how believable it feels. Research and reference gathering are a part of it, but the real skill is understanding which details sell the illusion. Often, it’s the ones that players never consciously notice but would immediately miss if they weren’t there. This could be subtle wear on edges, variations in the same material, or the way light breaks across an uneven surface.  

For me, realism means embracing the imperfections of reality. It’s not just about surface details like dust, scratches, or fingerprints, but the irregularities in shape and construction. This can be a beam that isn’t perfectly straight, a chair leg that wobbles slightly, or bricks that sit unevenly in a wall. Those cues make objects feel like they exist in the world, instead of being too perfect or manufactured. 

A sci-fi environment project we’ve worked on. See more on ArtStation.

In the early days, we had to fake most of this. We desaturated photos and overlaid them in Photoshop just to squeeze variations into a diffuse map. Models themselves were simpler, so a lot of the realism then was directly painted into textures. 

ZBrush was a turning point. For the first time, we were able to sculpt millions of polygons, add wrinkles, pores, and chipped edges, then bake all that richness into normal maps. Props and characters gained a depth that painting alone couldn’t achieve. 

Substance Designer and Painter took things further, redefining how we approached materials. We could layer dirt, wear and scratches procedurally, yet still direct them artistically. When PBR pipelines became the standard, materials finally behaved in a physically correct way. Metal reflected, wood absorbed, and light interacted naturally. 
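The “physically correct” behavior mentioned above comes from the metallic-roughness convention used by most PBR pipelines: metals have no diffuse response, and their specular color is tinted by the albedo itself. A minimal sketch of that split (the 0.04 dielectric reflectance is a common approximation, not a universal constant):

```python
def pbr_base(albedo, metallic):
    """Split an albedo color into diffuse and specular terms using the
    standard metallic-roughness convention."""
    dielectric_f0 = 0.04  # typical reflectance of non-metals at normal incidence
    # Metals contribute no diffuse color
    diffuse = tuple(c * (1.0 - metallic) for c in albedo)
    # Specular color blends from dielectric F0 toward the albedo as metalness rises
    f0 = tuple(dielectric_f0 * (1.0 - metallic) + c * metallic for c in albedo)
    return diffuse, f0

# Pure metal: no diffuse term, specular tinted by the albedo
d, f0 = pbr_base((0.9, 0.6, 0.2), metallic=1.0)
print(d)   # (0.0, 0.0, 0.0)
print(f0)  # (0.9, 0.6, 0.2)
```

This is why metal “reflects” and wood “absorbs” without the artist hand-painting either behavior: the shader derives it from two textures.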

Photogrammetry and scanning pushed realism even further. Libraries like Quixel Megascans brought real-world rocks, foliage, and surfaces straight into our pipelines, capturing natural complexity that would be far too time-consuming to hand-craft at scale. For characters, high-resolution skin and body scans introduced pores, wrinkles, and subtleties that made faces believable. Whether a cliff face or a close-up of a cheek, scans became the foundation for building realism. 

Now, with Unreal’s Nanite and Lumen, we can bring in high-res sculpts and scans without worrying about retopology or light baking. This technology lets us focus more on artistry than constraints.

Looking ahead, the next breakthroughs will come from smarter proceduralism and AI-assisted workflows. Tools that don’t just generate complexity but let us direct it. Just like how ZBrush, Substance, and photogrammetry opened new doors, AI and procedural systems will continue expanding what’s possible, giving artists greater freedom to push creativity further. 

Realism is also the result of collaboration, especially at the production level. The challenge is keeping everyone aligned to the same artistic vision.

On a single asset, one artist can take it from blockout to final. But at full production scale, realism only happens when multiple disciplines come together. Technical artists and engineers define pipelines and performance budgets. Material specialists set up libraries to ensure consistency. Environment and prop artists create within those systems, while art direction ties everything back to a unified look and feel. 

In my experience, especially in outsourcing, workflows can vary across studios, from AAA to indie. Adapting to pipelines, matching benchmarks, and maintaining visual consistency always come down to communication. Shared references, clear documentation, and regular feedback loops are what keep teams aligned to the same artistic vision, whether it’s a handful of props or an entire environment.

Procedural tools like Houdini and Substance Designer can make the creative process more artistic when the content is “directable.” For me, it’s always been about control and flexibility. A good procedural setup gives the artist intuitive controls to shape the outcome. This includes guiding randomness or complexity without feeling boxed in. If a tool starts dictating the result, it stops serving the vision. 

Take Houdini, for example. An undirected setup might scatter rocks randomly in a canyon. A directed one lets me say, “Okay, fewer rocks near the cliffs, more erosion down in the riverbed.” That’s when the tool feels like an extension of artistry.
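The difference between undirected and directed scattering boils down to one thing: an artist-authored density mask. A toy sketch of the idea (rejection sampling in a unit square; the `riverbed_bias` mask is a hypothetical stand-in for a painted attribute or heightfield layer):

```python
import random

def directed_scatter(count, density, seed=0):
    """Scatter points in a unit square, accepting each candidate with a
    probability given by a density mask. The mask is what makes the scatter
    'directable': change the function, change the layout."""
    rng = random.Random(seed)
    points = []
    while len(points) < count:
        x, y = rng.random(), rng.random()
        # Rejection sampling: candidates in high-density areas survive more often
        if rng.random() < density(x, y):
            points.append((x, y))
    return points

# Hypothetical mask: fewer rocks near the cliff (x ~ 0), more in the riverbed (x ~ 1)
riverbed_bias = lambda x, y: x
rocks = directed_scatter(200, riverbed_bias)
```

An undirected scatter is just the same call with a constant density; the artistry lives entirely in the mask.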

The same happened with Substance Designer. Once it exposed parameters like edge wear or rust spread, materials became art-directable instead of purely procedural. The key is transparency: I want to know what’s happening under the hood so I can make deliberate choices. Tools should extend creativity, not replace it. 
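Exposing a parameter is conceptually simple: instead of hard-coding a threshold inside the graph, you surface it as a slider. A minimal sketch, assuming a baked curvature input and two hypothetical exposed parameters (`wear_amount`, `contrast`):

```python
def edge_wear_mask(curvature, wear_amount=0.5, contrast=4.0):
    """Turn a baked curvature value (0 = flat, 1 = sharp edge) into a wear
    mask. wear_amount and contrast are the 'exposed parameters': the artist
    drags sliders instead of rewiring the graph."""
    # Higher wear_amount lowers the curvature threshold, so wear creeps
    # further down from the sharpest edges; contrast sharpens the falloff
    t = (curvature - (1.0 - wear_amount)) * contrast
    return max(0.0, min(1.0, t))

# Sharp edge is fully worn, flat surface stays clean
print(edge_wear_mask(1.0))  # 1.0
print(edge_wear_mask(0.0))  # 0.0
```

The transparency point follows from this: because the mapping from sliders to pixels is a readable function, the artist can predict what a tweak will do rather than gambling on a black box.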

We’ve always been driven to close the gap between what’s pre-rendered and what’s possible in real time. Every generation of tech, from normal maps to PBR to ray tracing, has pushed us closer. That same ambition shapes how I work: constantly experimenting, testing, and adapting new tools to fit my workflow. If something helps me create better or faster, it naturally becomes something I can share with others on the team.  

Every new wave of tech brings both opportunities and complexity. Sometimes it means achieving realism faster. What once took days of hand-painting in Photoshop can now be done in minutes using smart materials in Substance Painter. Other times, it’s about pushing quality further, like sculpting intricate damage in ZBrush and baking it into a game-ready asset.

My ultimate goal is to stay adaptable and experimental, making realism practical, so I can focus on creativity instead of fighting technical limitations.