Procedural Backrooms: Lighting, Regeneration, and PS1-Era Tricks

Sunday, Jun 8, 2025 | 3 minute read | Updated at Sunday, Jun 8, 2025


Creating a procedural, PS1-style Backrooms game

In one of my recent prototypes, I built a procedural Backrooms-style game with infinite, regenerating spaces. The goal was to maintain a continuous sense of disorientation — not just by layout, but through the systems themselves: non-Euclidean structure, minimalistic lighting, and retro visuals. Here’s how I approached the core problems and what I had to build to support them.


Continuous Room Generation in Non-Euclidean Space

The core of the prototype revolves around a non-Euclidean space: rooms aren't persistent. When you leave a room, it may regenerate into something entirely different the next time it's visited, which means I can't rely on traditional spatial data structures. Instead, the system tracks a small number of active rooms and reuses room instances based on the player's position and movement vector rather than absolute world coordinates.

Each “room” is generated procedurally using a graph-based structure (with plans to eventually plug this into my larger node system). I maintain a lightweight descriptor for each room’s last state, allowing for partial reuse or full regeneration as needed.
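
To make the tracking concrete, here's a minimal sketch of how it could be structured. The RoomDescriptor and RoomTracker names, the seed field, and the eviction budget are illustrative assumptions for this post, not the prototype's actual types.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative sketch only: RoomDescriptor and RoomTracker are hypothetical
// stand-ins, not the prototype's real classes.
public class RoomDescriptor
{
    public int Seed;                // seed the room was last generated from
    public Vector3 EntryDirection;  // movement vector the player arrived with
    public bool Dirty;              // player has left; eligible for regeneration
}

public class RoomTracker : MonoBehaviour
{
    [SerializeField] int activeRoomBudget = 4;

    readonly Dictionary<int, RoomDescriptor> descriptors = new Dictionary<int, RoomDescriptor>();
    readonly LinkedList<int> activeRooms = new LinkedList<int>();

    // Called when the player crosses into a room. The id comes from position
    // and movement vector, not from absolute world coordinates.
    public RoomDescriptor Enter(int roomId, Vector3 movementVector)
    {
        if (!descriptors.TryGetValue(roomId, out RoomDescriptor desc) || desc.Dirty)
        {
            // A dirty descriptor means the room may come back as something new.
            desc = new RoomDescriptor
            {
                Seed = Random.Range(int.MinValue, int.MaxValue),
                EntryDirection = movementVector,
                Dirty = false
            };
            descriptors[roomId] = desc;
        }

        activeRooms.Remove(roomId);   // re-entering moves the room to the back
        activeRooms.AddLast(roomId);

        if (activeRooms.Count > activeRoomBudget)
        {
            // The oldest active room falls out of scope and becomes regenerable.
            int evicted = activeRooms.First.Value;
            activeRooms.RemoveFirst();
            if (descriptors.TryGetValue(evicted, out RoomDescriptor old)) old.Dirty = true;
        }
        return desc;
    }
}
```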


Problem: Lighting Without Killing Performance

A big challenge was lighting. I wanted to capture the feel of flickering fluorescents: harsh, artificial light coming alive as the player moves through the space. But dynamic lighting (or raycasts to trigger it) scales poorly when the environment is being generated continuously.

✅ Solution: Vertex-Colored Lighting

I built a custom lighting system based entirely on mesh vertex colours. Instead of Unity’s built-in lights, I baked light “intensity” into the mesh directly. Each vertex holds its own emissive data, and lights are “animated on” over time using a coroutine system tied to the player’s movement through space.

This eliminated the need for runtime raycasts entirely. When a player enters a new room, the lighting system interpolates the vertex colours over time to simulate lights flickering to life.
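
As a rough illustration (not the project's actual component), the fade could be driven along these lines. VertexLightFade, the flicker curve, and the assumption that the fully lit vertex colours were baked into the mesh beforehand are all mine.

```csharp
using System.Collections;
using UnityEngine;

// Sketch: fades a mesh's vertex colours from black up to a pre-baked "lit"
// state to fake fluorescents flickering on. Assumes the material reads
// vertex colour as brightness.
[RequireComponent(typeof(MeshFilter))]
public class VertexLightFade : MonoBehaviour
{
    [SerializeField] float fadeDuration = 1.5f;
    [SerializeField] AnimationCurve flicker = AnimationCurve.EaseInOut(0f, 0f, 1f, 1f);

    Mesh mesh;
    Color[] litColors;    // per-vertex target intensity, baked at generation time
    Color[] workBuffer;

    void Awake()
    {
        mesh = GetComponent<MeshFilter>().mesh;  // instance copy, safe to modify
        litColors = mesh.colors;
        workBuffer = new Color[litColors.Length];
    }

    // Called by the room system when the player enters this room.
    public void LightUp() => StartCoroutine(FadeIn());

    IEnumerator FadeIn()
    {
        float t = 0f;
        while (t < fadeDuration)
        {
            t += Time.deltaTime;
            float k = flicker.Evaluate(t / fadeDuration);
            for (int i = 0; i < litColors.Length; i++)
                workBuffer[i] = Color.Lerp(Color.black, litColors[i], k);
            mesh.colors = workBuffer;
            yield return null;  // one interpolation step per frame, no raycasts
        }
        mesh.colors = litColors;  // snap to the exact baked state at the end
    }
}
```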


Problem: Vertex Density vs Retro Fidelity

The catch with vertex-coloured lighting is that large flat planes have too few vertices to represent any meaningful lighting gradients. If I left the floors and walls as single quads, I’d get lighting artifacts or uniform color blocks — not the look I was going for.

✅ Solution: Break the Mesh — Strategically

I ended up fragmenting the mesh planes into smaller quads, creating more vertices and thus more lighting data points. But to avoid killing performance (especially on slower systems), I leaned into a PS1-style visual: low-poly geometry, aliased edges, and fixed camera movement. The style gave me breathing room to optimize — lower fidelity was not just acceptable, but stylistically appropriate.
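
As an example of what that fragmentation might look like in code, here's a small helper that builds a plane as a grid of quads instead of a single quad; the SubdividedQuad name and its parameters are hypothetical.

```csharp
using UnityEngine;

// Sketch: a floor/wall plane built as a grid of quads so there are enough
// vertices to carry a lighting gradient. Names and layout are illustrative.
public static class SubdividedQuad
{
    public static Mesh Build(float width, float depth, int segments)
    {
        int vertsPerSide = segments + 1;
        var vertices = new Vector3[vertsPerSide * vertsPerSide];
        var colors = new Color[vertices.Length];
        var triangles = new int[segments * segments * 6];

        for (int z = 0; z < vertsPerSide; z++)
        for (int x = 0; x < vertsPerSide; x++)
        {
            int i = z * vertsPerSide + x;
            vertices[i] = new Vector3(width * x / segments, 0f, depth * z / segments);
            colors[i] = Color.black;  // starts unlit; the lighting system fades it in
        }

        int t = 0;
        for (int z = 0; z < segments; z++)
        for (int x = 0; x < segments; x++)
        {
            int i = z * vertsPerSide + x;
            triangles[t++] = i;
            triangles[t++] = i + vertsPerSide;
            triangles[t++] = i + 1;
            triangles[t++] = i + 1;
            triangles[t++] = i + vertsPerSide;
            triangles[t++] = i + vertsPerSide + 1;
        }

        var mesh = new Mesh { vertices = vertices, colors = colors, triangles = triangles };
        mesh.RecalculateNormals();
        return mesh;
    }
}
```

A modest segment count per plane already gives visible gradients while keeping the polygon count comfortably in PS1 territory.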


Optimization Notes

  • Room instances are pooled and recycled (see the sketch after this list).
  • Mesh generation is done on background threads (where possible), then applied on the main thread.
  • Lighting updates are purely CPU-side and use coroutines, avoiding per-frame calculations or Unity’s lighting pipeline.
  • Because everything is vertex-based, there’s no reliance on shaders or post-processing for lighting effects.
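
To illustrate the pooling point above, one way such a pool could be structured is sketched below; the Room component and its Rebuild(seed) method are assumptions made for the example, not the prototype's actual API.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical room component; the real project would rebuild geometry and
// vertex-colour lighting from the seed here.
public class Room : MonoBehaviour
{
    public void Rebuild(int seed) { /* regenerate layout, mesh, lighting */ }
}

// Simple pool: rooms are deactivated and reused rather than destroyed, so
// continuous generation doesn't churn allocations.
public class RoomPool : MonoBehaviour
{
    [SerializeField] Room roomPrefab;
    readonly Queue<Room> idle = new Queue<Room>();

    public Room Acquire(int seed)
    {
        Room room = idle.Count > 0 ? idle.Dequeue() : Instantiate(roomPrefab, transform);
        room.gameObject.SetActive(true);
        room.Rebuild(seed);  // same instance, new layout
        return room;
    }

    public void Release(Room room)
    {
        room.gameObject.SetActive(false);
        idle.Enqueue(room);
    }
}
```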

What’s Next?

I’m planning to integrate the room and lighting systems into a full node-based procedural generation framework. Eventually, this prototype will serve as a real-world testbed for multithreaded, graph-driven content — something closer to what you’d use in a full procedural game architecture.


TL;DR

  • Procedural, regenerating Backrooms-style rooms in a non-Euclidean space
  • Custom vertex-colour-based lighting to avoid runtime raycasts
  • Meshes subdivided to support low-fidelity lighting detail
  • PS1 aesthetic used deliberately to justify technical constraints
  • Built with performance-first mindset, optimized for continuous generation

If you’re building procedural spaces and want a lightweight, stylized approach to dynamic lighting, vertex colours + mesh fragmentation might be the combo you’re overlooking.

Let me know if you’d like a version of this prototype to test, or follow along as I fold it into the larger Procedural Labs graph system.

