Eight Years of BuildR: The Evolution of a Procedural Generation Toolkit

Thursday, May 8, 2025 | 6 minute read | Updated at Thursday, May 8, 2025


🏗️ From “Redacted” to BuildR: The Origins of a Procedural Tool

Back in 2012, I was deep into development on an ambitious indie game called Redacted — a stealth UAV simulator set within a dynamically generated city. The core idea was bold for the time: fly surveillance drones over procedurally built cityscapes, gather intel, and operate in an environment that never repeated itself. It was urban, systemic, and pushed Unity’s tooling hard in every direction.

And like many ambitious indie projects, it eventually hit the wall.

The scope kept expanding. Runtime terrain, AI, mission logic, flight control, city logic — every system I built uncovered the need for three more. At a certain point, I realized I had built a solid tech base for a game I couldn’t feasibly finish with the time and resources I had. But I wasn’t ready to walk away from it entirely.

There was one part of the project that consistently felt good: the building generation tech.

It wasn’t just prototyping — the system could generate usable geometry, complete with materials, floor control, and stylized façades. It had a visual rhythm. The mesh pipeline was reliable. I had something real. So rather than abandon it, I split it out. And that’s where BuildR began.

🎬 Lessons from Camera Path 2

A few months earlier, I had released another tool called Camera Path 2 on the Unity Asset Store. It let users build cinematic camera moves directly in-editor — and it did surprisingly well. That experience was eye-opening.

For the first time, I saw the power of good Unity tooling: clean UX, thoughtful feature scope, and polished presentation. It didn’t need to be a “game” to be valuable — developers just needed the right tools to help them make their games. And I realized that BuildR could be one of those tools.

The Unity Asset Store in 2013 had almost no procedural building systems. If you wanted city buildings, you either hand-modeled everything in a DCC tool or bought fixed meshes off the store. There was a clear gap, and I was sitting on a working solution.

So I shifted gears — Redacted was shelved, and BuildR took center stage.

🧰 BuildR 1: Designing for the Editor

The first version of BuildR was an in-editor building creation tool. You could define a floorplan with 2D handles, extrude it upward with floor-by-floor control, and apply styles via modular presets. It wasn’t a runtime system — the goal was to speed up urban modeling workflows for level designers and solo developers.
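To make that workflow concrete, here is a minimal sketch of the core idea: take a 2D floorplan polygon and extrude it into wall quads, one floor at a time. The class and method names are illustrative only, not BuildR’s actual API, and the real pipeline layers UVs, openings, and materials on top of this.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative only: extrude a 2D floorplan (XZ plane) into simple wall quads,
// one floor at a time. BuildR's real pipeline is considerably more involved.
public static class FloorplanExtruder
{
    public static Mesh Extrude(Vector2[] floorplan, int floors, float floorHeight)
    {
        var vertices = new List<Vector3>();
        var triangles = new List<int>();

        for (int floor = 0; floor < floors; floor++)
        {
            float y0 = floor * floorHeight;
            float y1 = y0 + floorHeight;

            for (int i = 0; i < floorplan.Length; i++)
            {
                Vector2 a = floorplan[i];
                Vector2 b = floorplan[(i + 1) % floorplan.Length];
                int baseIndex = vertices.Count;

                // One quad per wall segment per floor.
                vertices.Add(new Vector3(a.x, y0, a.y));
                vertices.Add(new Vector3(b.x, y0, b.y));
                vertices.Add(new Vector3(b.x, y1, b.y));
                vertices.Add(new Vector3(a.x, y1, a.y));

                // Flip this order if walls render inside-out for your floorplan winding.
                triangles.AddRange(new[] { baseIndex, baseIndex + 2, baseIndex + 1,
                                           baseIndex, baseIndex + 3, baseIndex + 2 });
            }
        }

        var mesh = new Mesh();
        mesh.SetVertices(vertices);
        mesh.SetTriangles(triangles, 0);
        mesh.RecalculateNormals();
        return mesh;
    }
}
```

Everything interesting — window cut-outs, façade styling, roof caps — happens on top of this simple loop, but the extrusion is the backbone the rest hangs from.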

Unity’s editor UI at the time was still immediate-mode (EditorGUILayout), but I spent a lot of time getting the UX right. Menus were organized, handles behaved predictably, and users could get from zero to a building in a few minutes. Camera Path had taught me that polish mattered, especially when competing on the Asset Store.

I also brought the same mindset to the store presence itself — promo art, documentation, video tutorials. The polish helped signal that this was a serious tool, not a weekend script dump.

🧑‍💻 Early Feedback & Community Uptake

The launch of BuildR was solid but slower than I hoped. It wasn’t an instant hit, but it gained traction steadily. And the feedback was promising — developers liked the workflow, and they especially liked not having to jump between Unity and Blender just to block out a city.

There were a few consistent feature requests early on:

  • “Can it support interiors?”
  • “What about curved walls or rounded corners?”

These requests would become the seeds of BuildR 2…

🧱 BuildR 2: Runtime Reinvention and User-Driven Expansion

BuildR 2 wasn’t just a feature update — it was a ground-up rethink of the tool’s role. Where BuildR 1 had focused on level design in-editor, BuildR 2 aimed to be a true procedural engine — one that could run in real-time, generate buildings on demand, and scale across different use cases, from city simulators to ambient urban backdrops.

🧠 Rethinking the Data Model

BuildR 1 had relied heavily on ScriptableObjects to store building data — a natural choice for editor-based workflows. But they weren’t built for runtime.

In BuildR 2, I introduced a new internal asset model: lightweight, thread-safe, and decoupled from Unity’s object lifecycle. These new data classes were:

  • Fast to instantiate
  • Thread-safe
  • Free from AssetDatabase constraints
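As a rough illustration of what that looks like, here is a sketch of plain C# data classes — no ScriptableObject base class, no AssetDatabase, nothing that ties them to the main thread. The names BuildingData and VolumeData are hypothetical, not BuildR’s real types:

```csharp
using System.Collections.Generic;

// Sketch of the idea: plain C# data classes rather than ScriptableObjects, so they
// can be created and mutated on any thread with no AssetDatabase involvement.
[System.Serializable]
public class BuildingData
{
    public List<VolumeData> volumes = new List<VolumeData>();
    public string facadeStyle;
    public int seed;
}

[System.Serializable]
public class VolumeData
{
    public float[] floorplanX;   // plain arrays serialize cleanly and copy cheaply
    public float[] floorplanZ;
    public int floors;
    public float floorHeight;
}
```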

⚙️ Multithreaded Mesh Generation

All mesh prep — shape planning, vertex layout, UVs — was offloaded to background threads. Final mesh application still occurred on the main thread, but the heavy lifting was done ahead of time.

This enabled true runtime city generation with minimal performance cost.
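The pattern itself is straightforward, and a hedged sketch of it looks something like this: geometry math in a Task on the thread pool, Mesh API calls back on the main thread. MeshDraft and RuntimeBuildingGenerator are illustrative names (reusing the hypothetical BuildingData from the sketch above), not BuildR’s actual classes:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using UnityEngine;

// Sketch of the split: geometry math off the main thread, Mesh API on it.
public class RuntimeBuildingGenerator : MonoBehaviour
{
    public async void Generate(BuildingData data)
    {
        // Heavy lifting on a worker thread: no UnityEngine.Mesh calls allowed here.
        MeshDraft draft = await Task.Run(() => BuildMeshData(data));

        // Back on the main thread: push the prepared buffers into a Mesh.
        var mesh = new Mesh();
        mesh.SetVertices(draft.vertices);
        mesh.SetUVs(0, draft.uvs);
        mesh.SetTriangles(draft.triangles, 0);
        mesh.RecalculateNormals();
        GetComponent<MeshFilter>().sharedMesh = mesh;
    }

    static MeshDraft BuildMeshData(BuildingData data)
    {
        // Plan shapes, lay out vertices and UVs into plain lists — thread-safe work only.
        return new MeshDraft();
    }
}

// Plain buffers that are safe to fill on any thread.
public class MeshDraft
{
    public List<Vector3> vertices = new List<Vector3>();
    public List<Vector2> uvs = new List<Vector2>();
    public List<int> triangles = new List<int>();
}
```

The await conveniently returns execution to Unity’s main thread via its synchronization context, which is what makes the final Mesh calls safe.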

🧰 UX Overhaul

Unity’s UI tooling had matured, and BuildR 2 took full advantage. The editor was rebuilt for clarity and speed, with clean panels for:

  • Volumes
  • Façades
  • Styles
  • Roofs
  • Decorative elements

🏢 Interiors and Curved Façades

User feedback drove two key features:

  • Interior generation: volumetric logic for wall thickness, internal rooms, and floorplans
  • Curved façades: arc definitions and Bézier control for non-orthogonal architecture

These systems integrated cleanly into the mesh pipeline, supporting details like curved chimneys, modular doors, and window rows.
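Under the hood, a curved façade can be reduced to the same straight wall segments the rest of the pipeline already understands. A minimal sketch of that idea — sampling a quadratic Bézier between two corners into a point list — might look like this (illustrative names, not BuildR’s API):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch: a curved facade becomes straight wall segments by sampling a quadratic
// Bezier between two corner points. The resulting point list feeds the same
// wall-extrusion pipeline as any straight facade.
public static class CurvedFacade
{
    public static List<Vector2> SampleArc(Vector2 start, Vector2 control, Vector2 end, int segments)
    {
        var points = new List<Vector2>();
        for (int i = 0; i <= segments; i++)
        {
            float t = i / (float)segments;
            Vector2 p = (1 - t) * (1 - t) * start
                      + 2 * (1 - t) * t * control
                      + t * t * end;
            points.Add(p);
        }
        return points;
    }
}
```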

🏗️ Editor vs Runtime: One System, Two Faces

A shared data core allowed users to design in-editor, export configs, and generate variations at runtime. BuildR became a hybrid — part editor utility, part runtime engine.
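In practice, the hybrid workflow might look something like the sketch below: a building description authored in the Inspector acts as a template, and runtime code spawns variations of it by changing only the seed. CityBlockSpawner is a hypothetical example built on the illustrative types from the earlier sketches, not a BuildR component:

```csharp
using UnityEngine;

// Illustrative only: one description authored in-editor, many runtime variations.
public class CityBlockSpawner : MonoBehaviour
{
    public BuildingData template;    // authored in the Inspector (the class is [Serializable])
    public int buildingCount = 10;

    void Start()
    {
        for (int i = 0; i < buildingCount; i++)
        {
            var variant = new BuildingData
            {
                volumes = template.volumes,          // share the authored shape...
                facadeStyle = template.facadeStyle,
                seed = i                             // ...but vary the seed per instance
            };

            var go = new GameObject($"Building_{i}", typeof(MeshFilter), typeof(MeshRenderer));
            go.transform.position = new Vector3(i * 15f, 0f, 0f);
            go.AddComponent<RuntimeBuildingGenerator>().Generate(variant);
        }
    }
}
```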


⚡ BuildR 3: Decoupled Architecture and Runtime-First Design

Unity’s ecosystem changed rapidly — prefabs, addressables, DOTS — and so did procedural needs. BuildR 3 was a rewrite for the modern era.

🧱 Pure Data Assets

ScriptableObjects were replaced entirely with custom data models — JSON-ready, serializable, network-safe.

This made BuildR 3 suitable for:

  • Cloud-driven generation
  • Server-authoritative world building
  • Multiplayer-ready content streaming
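Because the description is plain data, getting it in and out of JSON is trivial — which is what makes server-driven generation possible in the first place. A minimal sketch using Unity’s JsonUtility (any JSON library would do on the server side; BuildingData is still the hypothetical type from earlier):

```csharp
using UnityEngine;

// Sketch: because the building description is plain serializable data, it
// round-trips through JSON and can be produced outside Unity entirely.
public static class BuildingDataTransport
{
    // Editor tooling, a build pipeline, or a server emits the description as JSON.
    public static string ToJson(BuildingData data) => JsonUtility.ToJson(data);

    // The client rebuilds the data object and hands it to the generator.
    public static BuildingData FromJson(string json) => JsonUtility.FromJson<BuildingData>(json);
}
```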

⚙️ Full Threaded Pipeline

Mesh generation happened in memory, using background jobs. Only the final mesh creation touched Unity’s main thread. This made it possible to generate:

  • Thousands of buildings in parallel
  • Streamed content with no framerate impact
  • LOD-ready geometry for large maps
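A hedged sketch of that pipeline, reusing the illustrative MeshDraft and BuildingData types from earlier: draft every building on the thread pool in parallel, then apply the finished meshes a few per frame so the main thread never spikes. This shows the shape of the approach, not BuildR’s actual implementation:

```csharp
using System.Collections;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using UnityEngine;

// Sketch only: draft many buildings in parallel off the main thread, then apply
// the resulting meshes a few per frame so generation never stalls rendering.
public class CityStreamer : MonoBehaviour
{
    public int meshesPerFrame = 4;

    public IEnumerator GenerateBlock(List<BuildingData> buildings)
    {
        // All geometry math runs on the thread pool.
        Task<MeshDraft[]> job = Task.WhenAll(
            buildings.Select(b => Task.Run(() => BuildMeshData(b))));

        while (!job.IsCompleted)
            yield return null;   // keep the frame loop running while workers finish

        // Main thread: turn drafts into Meshes in small batches.
        int applied = 0;
        foreach (MeshDraft draft in job.Result)
        {
            ApplyDraft(draft);
            if (++applied % meshesPerFrame == 0)
                yield return null;
        }
    }

    static MeshDraft BuildMeshData(BuildingData data)
    {
        // Thread-safe geometry prep: fill plain lists, touch no UnityEngine.Object.
        return new MeshDraft();
    }

    void ApplyDraft(MeshDraft draft)
    {
        var go = new GameObject("Building", typeof(MeshFilter), typeof(MeshRenderer));
        var mesh = new Mesh();
        mesh.SetVertices(draft.vertices);
        mesh.SetTriangles(draft.triangles, 0);
        mesh.RecalculateNormals();
        go.GetComponent<MeshFilter>().sharedMesh = mesh;
    }
}
```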

🧩 Modular Volume System

A new hierarchical system defined:

  • Base volumes
  • Roof/cap volumes
  • Façade logic
  • Decorative detail layers

You could now define rulesets and assemble buildings from themes, enabling consistent procedural neighborhoods.
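Conceptually, a theme or ruleset is just constrained choice: a set of allowed options that a seeded random number generator picks from, so every building varies while the neighborhood stays coherent. A small hypothetical sketch (again reusing the illustrative BuildingData):

```csharp
// Sketch: a "theme" is constrained choice. A seeded RNG picks facades, roofs and
// heights from the theme's allowed options, so each building varies but the
// neighborhood stays stylistically consistent.
[System.Serializable]
public class BuildingTheme
{
    public string[] facadeStyles;
    public string[] roofStyles;
    public int minFloors = 2;
    public int maxFloors = 8;

    public BuildingData CreateBuilding(int seed)
    {
        var rng = new System.Random(seed);
        return new BuildingData
        {
            seed = seed,
            facadeStyle = facadeStyles[rng.Next(facadeStyles.Length)],
            // roof and volume choices would follow the same constrained-pick pattern
        };
    }
}
```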

🧪 Unexpected Use Cases

Developers used BuildR 3 for:

  • Multiplayer games with network-synced cities
  • AI testing for pathfinding in dynamic buildings
  • Architecture visualization with real-time user input

🧠 Eight Years, Three Versions, and One Guiding Principle

BuildR has always pursued practical proceduralism: give developers control, make systems fast and scalable, and avoid black-box magic.

What I Learned

  • Threading only works when planned from the start
  • Good UX makes or breaks adoption
  • Procedural tools succeed when they empower authorship

Where It Fits Today

BuildR is now part of a larger system: Procedural Labs — a graph-based generation framework for:

  • Terrain
  • Buildings
  • Props
  • Runtime logic
  • Stylized content pipelines

BuildR taught me that procedural generation is less about randomness and more about control. And it’s still evolving.


Thank you to everyone who’s used BuildR over the years — and if you’re just discovering it, welcome.

There’s more to come.

