
Custom Shader Development for Games: Holographic Effects, Procedural Systems, and Performance

Every distinctive visual style in games — the cel-shading of Genshin Impact, the holographic UI of Cyberpunk 2077, the procedural destruction in Noita — is built on custom shaders. Stock materials and built-in effects can get you 80% of the way, but that last 20% of visual identity requires technical art: the discipline of writing GPU programs that create specific visual effects within specific performance budgets.

This guide covers what tech art actually involves, how custom shaders work in Unity and Unreal, performance budgets you need to respect, and when it makes sense to invest in custom shader development versus using off-the-shelf solutions.

What Technical Art Is and Why It Matters

Technical art sits at the intersection of engineering and visual design. A tech artist doesn't just make things "look good" — they make things look specific while running at target frame rate on target hardware.

The role covers:

- Shader development — writing GPU programs for custom visual effects
- Material systems — building reusable material libraries that artists can configure without code
- VFX pipelines — particle systems, fluid effects, destruction systems
- Procedural content — algorithms that generate textures, geometry, or animations at runtime
- Performance optimization — profiling GPU workloads, reducing overdraw, managing shader complexity
- Tool development — custom editor tools for artists (texture bakers, LOD generators, material previewers)

Studios without dedicated tech art rely on asset store shaders or engine defaults. This works for prototypes, but shipping a game with a distinctive visual identity almost always requires custom shader work.

Shader Types: What Runs on the GPU

Surface / Fragment Shaders

The most common type. Surface shaders define how a material responds to light — its color, roughness, metallic properties, normal mapping, emissive glow, transparency. Every object you see in a game is rendered by a surface shader.

Use cases: Custom material effects (holographic surfaces, stylized toon shading, iridescence, subsurface scattering for skin), environment-reactive materials (snow accumulation, wetness), damage visualization.

Vertex Shaders

Vertex shaders manipulate the geometry of a mesh before it's rasterized. They can move, scale, and transform vertices — enabling effects that change the shape of objects in real-time.

Use cases: Wind animation on vegetation, water surface displacement, character hit reactions, vertex-based animation (cloth, flags, hair), melting/dissolve effects, terrain deformation.

Post-Processing Shaders

Applied after the scene is rendered, post-processing shaders modify the final image. They operate on the full screen framebuffer, not individual objects.

Use cases: Color grading, bloom, depth of field, vignette, screen-space reflections, screen-space ambient occlusion, stylized outlines, glitch effects, underwater distortion.

Compute Shaders

General-purpose GPU programs that don't directly render pixels. Compute shaders process data in parallel — useful for simulation and data-heavy tasks.

Use cases: GPU particle systems (millions of particles), fluid simulation, cloth simulation, procedural mesh generation, AI pathfinding on GPU, real-time grass/vegetation placement.

Unity: ShaderLab, Shader Graph, and Custom HLSL

Unity offers three approaches to shader development, each with different tradeoffs:

Shader Graph (Visual)

Unity's node-based shader editor. You connect nodes to build shaders visually, without writing code. Shader Graph generates HLSL under the hood and works with both URP and HDRP.

Pros: Accessible to artists, fast iteration, visual preview in real-time. Cons: Limited to what the node library supports. Complex logic (custom lighting models, advanced noise functions, multi-pass effects) is difficult or impossible without custom nodes.

Best for: Material variations, simple-to-moderate effects, artist-driven material authoring.

ShaderLab + HLSL (Code)

Unity's shader language wraps HLSL in a ShaderLab structure that defines properties, subshaders, passes, and fallbacks. You write the vertex and fragment programs in HLSL within the ShaderLab file.

Pros: Full control over the rendering pipeline, can implement any effect the GPU supports, better performance optimization (you control every instruction). Cons: Steeper learning curve, no visual preview during authoring, debugging is harder.

Best for: Custom lighting models, advanced effects, performance-critical shaders, multi-pass rendering.

Custom Shader Graph Nodes

The bridge between visual and code approaches. You write custom HLSL functions and expose them as nodes in Shader Graph. Artists get the visual workflow, engineers get the custom logic.

Best for: Teams where artists build materials and engineers provide the building blocks.

Unreal: Material Editor, Custom HLSL, and Material Functions

Unreal Material Editor (Visual)

Unreal's node-based material system is more powerful than Unity's Shader Graph. It supports complex logic, custom HLSL expressions within nodes, and a rich library of built-in functions.

Pros: Extremely capable — most effects can be built entirely in the Material Editor. Real-time preview, material instances for artist-friendly parameter tuning, automatic LOD and platform scaling. Cons: Complex material graphs become difficult to read and maintain. Performance characteristics aren't always obvious from the visual graph.

Custom HLSL in Unreal

For effects beyond the Material Editor's capabilities, you can write custom HLSL and integrate it via the Custom Expression node or the full Global Shader system. The Global Shader pipeline gives you access to the entire rendering pipeline — compute passes, custom render targets, multi-pass effects.

Best for: Custom render features (outline rendering, volumetric effects, screen-space techniques), compute shader workloads, engine-level rendering modifications.

Material Functions and Material Layers

Unreal's system for reusable shader logic. Material Functions are subgraphs that can be shared across materials. Material Layers allow you to blend multiple materials on a single surface with automatic weight mapping.

Best for: Large projects with many materials that share common effects (weathering, damage, snow accumulation).

Hologram Shader Breakdown

Holographic effects are one of the most requested custom shader types. Here's the anatomy of a production hologram shader (the type of effect we built for our hologram shader project):

Core Components

1. Fresnel rim glow: Bright edges that fade toward the center of the surface. Calculated from the dot product of the surface normal and view direction. Higher values at glancing angles create the characteristic edge glow.

2. Scanlines: Horizontal lines that scroll across the surface, simulating a CRT or projection effect. Implemented as a sine wave mapped to world-space Y position, scrolling over time. Line thickness and spacing are artist-configurable properties.

3. Noise distortion: Perlin or simplex noise that offsets UV coordinates and vertex positions, creating the jittery, unstable look of a holographic projection. Two noise layers at different scales and speeds create natural-looking interference.

4. Alpha flickering: The overall opacity pulses randomly, simulating signal instability. Implemented as a combination of sine waves at irrational frequency ratios (so the pattern never visibly repeats).

5. Color shift: The base color shifts between two or three hues over time, simulating chromatic interference. Often blue-to-cyan for sci-fi holograms, or warm tones for magical/fantasy variants.

6. Vertex displacement: Vertices are slightly displaced along their normals based on noise, creating a subtle "breathing" or "wavering" effect. Keep displacement small (0.5-2cm) to avoid breaking silhouettes.
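The per-component math above can be sketched CPU-side. This is an illustrative Python reference, not production code: the names, powers, and frequencies are assumptions, and in a real shader these functions would live in HLSL and run per pixel.

```python
import math

def fresnel(normal_dot_view: float, power: float = 3.0) -> float:
    """Rim term: 0 facing the camera, approaching 1 at glancing angles."""
    ndv = max(0.0, min(1.0, normal_dot_view))
    return (1.0 - ndv) ** power

def scanline(world_y: float, time_s: float, frequency: float = 40.0, speed: float = 2.0) -> float:
    """Scrolling horizontal lines: a sine over world-space Y, offset by time."""
    return 0.5 + 0.5 * math.sin((world_y + time_s * speed) * frequency)

def alpha_flicker(time_s: float) -> float:
    """Two sines at an irrational frequency ratio, so the pulse never visibly repeats."""
    pulse = 0.5 * math.sin(time_s * 7.0) + 0.5 * math.sin(time_s * 7.0 * math.sqrt(2.0))
    return 0.85 + 0.15 * pulse  # opacity stays within [0.7, 1.0]
```

The fragment shader would multiply these terms together (rim glow plus scanline mask, scaled by the flicker alpha) before writing the final color.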

Performance Cost

A well-optimized hologram shader costs approximately:

- Vertex shader: 0.02-0.05ms per object (noise + displacement)
- Fragment shader: 0.05-0.15ms per object at 1080p (fresnel + scanlines + noise + alpha)
- Total per hologram object: 0.07-0.2ms

At a 16.6ms frame budget (60fps), a single hologram object costs about 1% of your frame. Ten hologram objects can push past 10%. Budget accordingly.
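The budget arithmetic is worth making explicit. A minimal sketch, assuming the worst-case 0.2ms per object from the breakdown above:

```python
def hologram_budget_share(per_object_ms: float, count: int, frame_ms: float = 16.6) -> float:
    """Fraction of the frame budget consumed by `count` hologram objects."""
    return (per_object_ms * count) / frame_ms

# Worst-case cost from the breakdown above: 0.2ms per object.
one = hologram_budget_share(0.2, 1)   # ~0.012, about 1% of the frame
ten = hologram_budget_share(0.2, 10)  # ~0.12, past 10% of the frame
```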

Performance Budgets for Custom Shaders

The most common mistake in custom shader development is building beautiful effects that tank frame rate. Every shader must be designed within a performance budget.

GPU Time Budget (Per Frame)

At 60fps, you have 16.6ms total. Of that, shaders typically get:

- Opaque geometry (all materials): 4-6ms
- Transparent geometry: 1-2ms
- Post-processing (all effects): 2-3ms
- Shadows: 1-2ms
- UI: 0.5-1ms

A single material shouldn't cost more than 0.3ms on screen. If it does, it needs optimization or it needs LOD variants that switch to a simpler shader at distance.

Common Performance Killers

Overdraw: Transparent objects rendered on top of each other. Each layer multiplies the shader cost. A stack of 5 transparent hologram objects is 5x the fragment cost. Use depth sorting and limit transparent layer count.

Texture samples: Each texture lookup costs GPU time. A material with 8 texture samples is expensive. Combine textures into channel-packed atlases (roughness in R, metallic in G, AO in B, emissive in A).
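Channel packing can be illustrated with a pack/unpack pair. A Python sketch of the idea (in production the packing happens offline in a texture tool, and the unpack is a single texture sample in the shader):

```python
def pack_mask(roughness: float, metallic: float, ao: float, emissive: float) -> tuple:
    """Pack four grayscale maps into one RGBA texel: one sample instead of four."""
    to_byte = lambda v: max(0, min(255, round(v * 255)))
    return (to_byte(roughness), to_byte(metallic), to_byte(ao), to_byte(emissive))

def unpack_mask(texel: tuple) -> tuple:
    """Shader-side unpack: each channel back to the [0, 1] range."""
    return tuple(channel / 255 for channel in texel)
```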

Complex math in fragment shaders: Pow, sin, cos, sqrt are expensive per pixel. Move calculations to vertex shader where possible (runs per vertex instead of per pixel — usually 100-1000x fewer invocations).

Fullscreen post-processing: A post-process effect runs on every pixel on screen. At 1920x1080, that's 2 million pixels. At 4K, it's 8.3 million. Every instruction in a post-process shader is multiplied by millions.
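The vertex-versus-fragment tradeoff and the overdraw multiplier can be put into numbers. A back-of-envelope sketch with illustrative counts:

```python
def shading_invocations(vertex_count: int, covered_pixels: int, overdraw: float = 1.0) -> dict:
    """Per-frame shader invocations: vertices run once, pixels run once per layer."""
    return {"vertex": vertex_count, "fragment": int(covered_pixels * overdraw)}

# A 10k-vertex hologram covering 500k pixels, stacked 5 transparent layers deep:
work = shading_invocations(10_000, 500_000, overdraw=5.0)
# fragment work is 250x the vertex work, so moving math per-vertex pays off fast
```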

LOD Strategy for Shaders

Just as mesh LOD reduces polygon count at distance, shader LOD reduces material complexity. Define multiple SubShaders or material variants:

- LOD 0 (close): Full-quality shader with all effects
- LOD 1 (medium): Remove expensive effects (noise distortion, vertex displacement)
- LOD 2 (far): Simple unlit color with alpha, no per-pixel effects
- LOD 3 (very far): Billboard or impostor
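The selection logic itself is simple. A sketch of distance-based variant switching, with hypothetical variant names and thresholds (real projects tune these per platform):

```python
def shader_lod(distance_m: float) -> str:
    """Pick a material variant by camera distance (thresholds are illustrative)."""
    if distance_m < 10.0:
        return "LOD0_full"         # all effects on
    if distance_m < 30.0:
        return "LOD1_reduced"      # drop noise distortion and vertex displacement
    if distance_m < 80.0:
        return "LOD2_unlit_alpha"  # simple unlit color with alpha
    return "LOD3_impostor"         # billboard / impostor
```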

Procedural Shader Systems

Procedural systems generate visual content algorithmically instead of from authored textures. They save memory (no texture data), scale to any resolution, and create infinite variation.

Common Procedural Techniques

Noise functions: Perlin, Simplex, Voronoi, Worley noise — the building blocks of procedural textures. Layer multiple noise octaves at different scales for natural-looking patterns (terrain, clouds, wood grain, stone).
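Octave layering (fractal Brownian motion) is the core trick. A minimal Python sketch using cheap hash-based value noise as a stand-in for Perlin or Simplex:

```python
import math

def value_noise(x: float) -> float:
    """Cheap hash-based 1D value noise in [0, 1); stand-in for Perlin/Simplex."""
    def hash01(i: int) -> float:
        return (math.sin(i * 127.1) * 43758.5453) % 1.0
    i, f = math.floor(x), x - math.floor(x)
    f = f * f * (3 - 2 * f)  # smoothstep interpolation between lattice points
    return hash01(i) * (1 - f) + hash01(i + 1) * f

def fbm(x: float, octaves: int = 4) -> float:
    """Layer octaves at doubling frequency and halving amplitude, then normalize."""
    total, amplitude, frequency, norm = 0.0, 1.0, 1.0, 0.0
    for _ in range(octaves):
        total += amplitude * value_noise(x * frequency)
        norm += amplitude
        amplitude *= 0.5
        frequency *= 2.0
    return total / norm
```

The same structure generalizes to 2D/3D noise in a shader; only the hash and interpolation change.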

Distance fields: Signed distance functions (SDFs) define shapes mathematically. Used for procedural UI, text rendering, and smooth organic shapes. Ray marching through SDFs creates smooth, infinite-detail surfaces.
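Two SDF building blocks make the idea concrete: a primitive distance function and a smooth-union blend. A Python sketch of the standard formulas (in a shader these run per pixel inside the ray-march loop):

```python
import math

def sd_sphere(p: tuple, radius: float) -> float:
    """Signed distance to a sphere at the origin: negative inside, zero on surface."""
    return math.sqrt(p[0] ** 2 + p[1] ** 2 + p[2] ** 2) - radius

def smooth_union(d1: float, d2: float, k: float = 0.25) -> float:
    """Blend two SDFs with a smooth joint of width k instead of a hard crease."""
    h = max(0.0, min(1.0, 0.5 + 0.5 * (d2 - d1) / k))
    return d2 * (1 - h) + d1 * h - k * h * (1 - h)
```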

Wave functions: Sine, triangle, and sawtooth waves create repeating patterns — scanlines, energy fields, pulse effects, loading animations. Combine multiple waves at different frequencies for complex patterns.

Cellular automata: Grid-based rules that generate emergent patterns. Used for procedural growth (moss, rust, crystal formation), terrain generation, and organic spread effects.
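A single automaton step is enough to show the mechanism. A toy rust-spread rule in Python (the grid, threshold, and rule are illustrative; a production version would run as a compute shader or bake into a texture):

```python
def spread_step(grid: list) -> list:
    """One CA step: an empty cell (0) becomes rusted (1) if 2+ neighbors are rusted."""
    h, w = len(grid), len(grid[0])
    nxt = [row[:] for row in grid]
    for y in range(h):
        for x in range(w):
            if grid[y][x] == 1:
                continue
            neighbors = sum(
                grid[ny][nx]
                for ny in range(max(0, y - 1), min(h, y + 2))
                for nx in range(max(0, x - 1), min(w, x + 2))
                if (ny, nx) != (y, x)
            )
            if neighbors >= 2:
                nxt[y][x] = 1
    return nxt
```

Running the step repeatedly makes the rusted region creep outward from its seed cells, which is exactly the emergent-growth look these effects rely on.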

When to Invest in Custom Shaders vs Off-the-Shelf

Use off-the-shelf when: the effect is standard (PBR materials, basic water, simple particles), the project budget is under $200K, or the visual style doesn't require a distinctive look.

Invest in custom shaders when: visual identity is a selling point (stylized games, branded experiences), performance is critical (mobile, VR — where every millisecond matters), or you need effects that don't exist in any asset store (novel interaction feedback, data visualization, brand-specific materials).

How WODH Approaches Tech Art

Our 3D Art & Design team includes dedicated tech artists who work across Unity and Unreal. We've built holographic shader systems, procedural environment materials, custom VFX pipelines, and performance-optimized material libraries for mobile and VR. For studios looking to outsource game art including tech art, we integrate directly into your pipeline.

Tech art is where visual quality and technical performance meet. If your project needs a distinctive look that runs at target frame rate on target hardware, we can build the shader systems and material pipelines to deliver it.

Written by WODH Team