MrDocs Bootstrap: One Script to Build Them All

Alan de Freitas · Apr 15, 2026

When new developers joined the MrDocs team, we expected the usual ramp-up: learning the codebase, understanding the architecture, and getting comfortable with the review process. What we did not expect was that building and testing the project would be the hardest part. People dedicated to the project full-time spent weeks just trying to get a working build. Even when they succeeded, each person ended up with their own set of workarounds: a custom script here, a patched flag there, an undocumented environment variable somewhere else. One unrelated commit from someone else could silently break another developer’s local setup. And even after all of that, they didn’t know how to run the commands to test the project.

As the complexity grew, we naturally reached for a package manager. We adopted vcpkg, but over time we discovered that our problem was too complex for what any package manager is designed to handle. The build type combinations, the sanitizer propagation, the cross-platform toolchain differences, and the IDE configurations: these are workflow problems that kept accumulating. That realization, combined with an onboarding crisis where new contributors could not build the project at all, led us to write our own bootstrap script. The idea was not unfamiliar: at the C++ Alliance, we work closely with the Boost libraries, and Boost has shipped a bootstrap script for years. We knew the pattern worked. We just needed to apply it to our own dependency problem.

This post explains why robust C++ workflows are fundamentally difficult, not only for dependency management but also for supporting multiple platforms, compilers, and testing configurations. It describes what we learned from our experience with vcpkg and how a bootstrap script solved the problem for MrDocs.

Why Dependency Management Is Hard

A Combinatorial Explosion

Suppose your project depends on Package A >=1.0 and Package B >=2.0, but every version of A >=1.0 requires B <=1.5. You are stuck. With hundreds of packages, each with multiple versions and possibly conflicting or conditional dependencies, the problem explodes combinatorially.

This is not hyperbole. Package dependency resolution is NP-complete. It reduces directly to the Boolean satisfiability problem (SAT). Each package version is a boolean variable, each dependency constraint is a clause, and finding an installable set is equivalent to finding a satisfying assignment. Real-world tools handle this with heuristics (like APT and pip) or outright SAT solvers (like libsolv, used by DNF and Zypper).

In the worst case, finding a consistent set of dependency versions requires exponential time. Verifying one is polynomial, but discovering it may not be.
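To make the reduction concrete, here is a minimal sketch (not MrDocs code) that encodes the tiny dependency conflict above as boolean constraints and brute-forces an assignment; the package versions and constraints are invented for illustration:

```python
from itertools import product

# Each candidate package version is a boolean variable: chosen or not.
versions = ["A1.0", "A1.2", "B1.5", "B2.0"]

def installable(chosen):
    # Clause: exactly one version of each package is selected.
    if sum(v.startswith("A") for v in chosen) != 1:
        return False
    if sum(v.startswith("B") for v in chosen) != 1:
        return False
    # Clause: the project requires B >= 2.0.
    if "B2.0" not in chosen:
        return False
    # Clause: every version of A >= 1.0 requires B <= 1.5.
    if any(v.startswith("A") for v in chosen) and "B1.5" not in chosen:
        return False
    return True

# Brute force every assignment (exponential, like SAT in the worst case).
solutions = [
    {v for v, on in zip(versions, assignment) if on}
    for assignment in product([False, True], repeat=len(versions))
    if installable({v for v, on in zip(versions, assignment) if on})
]
print(solutions)  # [] -- the constraints are unsatisfiable
```

A real resolver replaces the brute-force loop with heuristics or a SAT solver, but the shape of the problem is exactly this.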

How Other Ecosystems Hide This Complexity

Most users never notice this because package managers use tricks. When npm cannot satisfy all constraints, it installs multiple versions of the same package in nested node_modules directories, so both versions get bundled into the final application. Cargo does something similar: when two crates require SemVer-incompatible versions of the same dependency, it includes both in the build. Most users are not aware this is happening, and would probably not be happy about it if they were: bundling two versions of the same library increases binary size, can cause subtle bugs when types from different versions interact, and makes the dependency graph harder to reason about. In C++, the trick is not even available. You cannot link two versions of the same library into a single binary. When the constraints are unsatisfiable, there is no quiet fallback. You get a build error.

Why C++ Makes It Worse

  • No standard package format: unlike npm, pip, or Cargo, C++ has no universal package format. Every dependency must be compiled with compatible settings, and pre-built binaries are the exception rather than the rule.
  • ABI compatibility: different compilers, compiler versions, and even compiler flags can produce incompatible binaries. You cannot just link any two object files together.
  • API compatibility: header-only and compiled libraries have different concerns. Template instantiation happens at the consumer’s compile time, so a header-only library can break when the consumer’s compiler or standard library version changes, even if the library itself has not changed.
  • Categorical options: choices like shared/static linking, exceptions on/off, and RTTI on/off need to be consistent across the entire dependency chain. If one library is built with exceptions disabled and another expects them, you get subtle runtime failures or linker errors.
  • Viral flags: some flags must propagate to all dependencies, some must not, and some are optional per dependency. Build type is a good example of the nuance. You might want MrDocs in Debug, but building LLVM in Debug makes it too slow to use. Building LLVM in Release makes it too hard to debug when the bug is in LLVM. So you end up with combinations like “MrDocs in Debug, LLVM in Debug with optimization, everything else in Release.” The propagation decision can vary per dependency and per situation.
  • Viral macros: preprocessor macros can create different binary versions of the same library. If spdlog depends on fmt with a certain macro configuration, every other dependency on fmt in the hierarchy has to use the same macro. A macro used in fmt might affect the macros available in peer dependencies that define fmt formatters. This creates constraints that propagate transitively through the dependency graph, and no package manager tracks macro configurations.
  • Sanitizer propagation: sanitizers deserve their own mention because they are not all equally viral. UndefinedBehaviorSanitizer is the lightest: it relies on compile-time checks and can share the same dependency builds as the unsanitized configuration. AddressSanitizer, MemorySanitizer, and ThreadSanitizer each need their own separately built LLVM with instrumented dependencies. ASan and MSan go further: they also require an instrumented libc++ built as an LLVM runtime. MSan is the extreme case. It reports false positives on any uninstrumented code, so the entire chain has to be instrumented: first build the C++ standard library with MSan, then build all dependencies against that instrumented standard library, then build MrDocs itself. That is three layers of builds with a single flag threading through all of them. No package manager models these propagation levels.
  • Build type incompatibility: on MSVC, Debug and Release are ABI-incompatible at the CRT level. This means you cannot just build your dependencies in Release and your project in Debug for a faster development cycle. You need all of them on the same side of the Debug/Release boundary. A Debug build with optimization (“OptimizedDebug”) is structurally different from a Release build with debug symbols (“RelWithDebInfo”). The first uses the Debug CRT with /O2; the second uses the Release CRT with debug info. Mixing them causes linker errors. This forces you into configurations that no standard build type represents.
  • Platform-specific toolchain setup: each platform has its own way of locating and configuring compilers. On Linux, GCC and Clang are on PATH. On macOS, Homebrew Clang installs toolchain components (llvm-ar, llvm-ranlib, ld.lld) and its standard library (libc++) in non-standard locations that differ from AppleClang’s. The headers and libraries are not on the default search path, so you have to pass their locations explicitly through compiler and linker flags for everything you compile. On Windows, MSVC does not live on PATH at all: it requires environment variables set by vcvarsall.bat, and locating the correct Visual Studio installation requires vswhere.exe. None of this is handled by package managers.
  • Compiler and standard library combinations: on Linux, Clang uses whatever libstdc++ is installed on the system rather than shipping its own. Ubuntu 24.04 ships GCC 13, but MrDocs needs GCC 14 features (like <print>). So a developer using Clang 20 on a fresh Ubuntu machine gets build errors from the standard library, not from their own code. Testing every Clang version with every GCC’s libstdc++ is infeasible, but specific combinations matter, and the mismatch is not obvious to the developer when it happens.
  • Platform explosion: Windows/Linux/macOS multiplied by Debug/Release/OptimizedDebug, GCC/Clang/MSVC/AppleClang, shared/static, and sanitizer variants creates a combinatorial explosion of configurations that all need to be tested. Each platform also has its own quirks: git symlinks behave differently on Windows, Ninja availability varies, and even the way you specify compiler flags differs between MSVC and GCC/Clang.
  • Conditional dependencies: in C++, build options frequently add or remove entire dependencies. An image processing library might support PNG, JPEG, and WebP, each requiring its own codec library. Enabling or disabling a format changes the dependency graph. Build scripts also commonly look for host dependencies (system libraries for talking to the OS, GPU, or network) that you are not expected to build yourself but that must be present on the machine. The dependency graph is not static; it depends on the configuration.
  • Closed-source dependencies: all of the problems above assume you have the source code and can rebuild with the correct flags. Sometimes you do not. When a dependency is distributed only as a pre-built binary, there is no way to adjust the ABI, propagate sanitizer flags, or change the build type. If it was compiled with incompatible settings, there is nothing you can do about it. It becomes a hard constraint on the entire system.
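The MSan case is the clearest illustration of propagation depth. A simplified sketch of the three build layers (paths and flags condensed for illustration; the real recipes pass many more options, and `/path/to/msan-libcxx` is a placeholder):

```shell
# Layer 1: build libc++ itself with MemorySanitizer as an LLVM runtime
cmake -S llvm-project/runtimes -B build-libcxx-msan \
  -DLLVM_ENABLE_RUNTIMES="libcxx;libcxxabi" \
  -DLLVM_USE_SANITIZER=MemoryWithOrigins
cmake --build build-libcxx-msan

# Layer 2: build every dependency against the instrumented libc++
cmake -S duktape -B build-duktape-msan \
  -DCMAKE_CXX_FLAGS="-fsanitize=memory -stdlib=libc++ -I/path/to/msan-libcxx/include" \
  -DCMAKE_EXE_LINKER_FLAGS="-fsanitize=memory -L/path/to/msan-libcxx/lib"
cmake --build build-duktape-msan

# Layer 3: build the project with the same flags, linking only instrumented code
cmake -S mrdocs -B build-mrdocs-msan \
  -DCMAKE_CXX_FLAGS="-fsanitize=memory -stdlib=libc++ ..."
```

One flag, three layers: forget it at any layer and MSan reports false positives from the uninstrumented code.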

%%{init: {"theme": "base", "themeVariables": {"primaryColor": "#f7f9ff", "primaryBorderColor": "#9aa7e8", "primaryTextColor": "#1f2a44", "lineColor": "#b4bef2", "secondaryColor": "#fbf8ff", "tertiaryColor": "#ffffff", "fontSize": "14px"}}}%%
mindmap
  root((C++ Dependencies))
    No Standard Format
      Built from source
      Closed-source binaries
    Compatibility
      ABI
      API / Templates
      Build Type / CRT
    Propagation
      Viral flags
      Viral macros
      Sanitizers
      Categorical options
    Dependencies
      Conditional on build options
      Host / system libraries
      Closed-source binaries
    Platform
      Toolchain setup
      Compiler + stdlib combos
    Combinatorial explosion

In C++, the general case involves so many dimensions that no existing tool handles all of them well.

What About CPS?

The Common Package Specification (CPS) is an interesting effort to standardize how C++ packages are consumed. A .cps file describes everything a build system needs to find and link against an already-built package: include paths, library paths, compiler flags. This is valuable, but it operates at the point of consumption, where we have already made all the decisions about platform, compiler, build type, and sanitizers. It assumes the dependency has already been built in a compatible way. It does not describe how to build the dependency with the correct flags in the first place. For example, if we need AddressSanitizer, all dependencies must be built with ASan instrumentation. A CPS file tells us how to consume a package that was built with ASan, but it does not know how to rebuild that package with ASan if it was not. The problems described above are all about making those upstream decisions correctly, which happens before CPS enters the picture.
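For reference, a minimal .cps file might look like the following. This is a hand-written illustration, not an excerpt from the specification; the field spellings follow the draft spec loosely, and the values are invented:

```json
{
  "name": "fmt",
  "cps_version": "0.13",
  "version": "11.0.2",
  "default_components": ["fmt"],
  "components": {
    "fmt": {
      "type": "archive",
      "includes": ["@prefix@/include"],
      "location": "@prefix@/lib/libfmt.a",
      "definitions": ["FMT_HEADER_ONLY=0"]
    }
  }
}
```

Everything in this file describes an artifact that already exists. Nothing in it says how to produce that artifact for a different build type, sanitizer, or toolchain.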

What Went Wrong for MrDocs

MrDocs depends on LLVM, Duktape, Lua, and libxml2 (and previously also fmt). Over time, three categories of problems accumulated.

Where vcpkg Fell Short

For over a year, we used vcpkg to manage these dependencies. MrDocs is a tool, not a library, so we only needed vcpkg for acquiring our own dependencies rather than for making ourselves easy to consume downstream. It worked at first, but the complexity of our workflows gradually outgrew what vcpkg was designed to handle:

  • Build types: MrDocs developers frequently need a Debug build with optimization enabled because the codebase is large enough that an unoptimized debug build is painfully slow. On MSVC, Debug and Release are ABI-incompatible, so a “Debug with optimization” configuration does not fit neatly into vcpkg’s Debug/Release binary model.
  • Patches and dual paths: vcpkg applies patches to libraries that do not follow CMake conventions. That meant supporting two ways to find the same library: the vcpkg-patched version and the upstream version. When a library follows CMake conventions, we barely need vcpkg; when it does not, the patches create a divergent variant, making vcpkg less useful rather than more. Contributors kept opening PRs proposing yet another way to locate a dependency, and in a build script every new path is expensive to test.
  • Rigid baseline: vcpkg’s baseline model pins all libraries to a single snapshot. We are tightly coupled to a specific LLVM commit, so we could not use vcpkg for LLVM from the start. That alone meant vcpkg could only manage a subset of our dependencies. On top of that, when fmt bumped a major version and broke downstream consumers, it showed that the baseline approach is too rigid for projects that use a few unrelated libraries. Sometimes the entire baseline would be updated and libraries we had no reason to touch just got upgraded, introducing unexpected breakage. Different developers also had different baseline expectations, so the same vcpkg.json could produce different results depending on when someone last updated.
  • Missing dependencies: some dependencies were not in vcpkg at all, or not configured the way we needed them. LLVM is the classic example: we need a specific commit, built with specific flags. Tools do not provide their own vcpkg integration; everything is centralized in the vcpkg repository. This forced us into mixed-source dependency management where some deps come from vcpkg and some from custom scripts.
  • No variant support: when we needed sanitizer builds (ASan, MSan, UBSan, TSan), vcpkg had nothing to offer. It knows Debug and Release. Building sanitized variants required custom scripts or custom environment variables to pass the information to the package manager internally.
  • Manifest vs. classic mode: vcpkg offers two modes for specifying dependencies. Some users simply did not like one of the modes, and we had so many complaints that we ended up supporting both. Unlike npm’s local and global modes, vcpkg’s manifest and classic modes do not play well together, so supporting both effectively meant maintaining two separate dependency workflows.

The vcpkg team has done outstanding work on a genuinely difficult problem, and vcpkg handles a lot of it well. Many of these limitations may simply be the best anyone can do given the complexity of the language. Most of the problems listed above do have external solutions: you can set custom triplets, configure environment variables, pass flags manually, and configure build types from outside vcpkg. That is how we handled it for a long time. The issue is that those solutions live outside the vcpkg workflow. We owned that part, and maintaining it was hard. Having vcpkg in the equation meant one more workflow to support, even when the problem was not vcpkg’s fault. The accumulated complexity of maintaining vcpkg alongside our own custom scripts is what eventually became unsustainable.

The Problems No Package Manager Solves

  • Dependency acquisition at configure time: we once had FetchContent as an optional alternative to find_package, so CMake could download dependencies if they were not already present. A team member’s internet went down during a build and CMake failed. The reaction was strong: nobody should be required to have internet to compile a project they already downloaded. The feature was removed entirely. This reinforced that dependency acquisition needed to be a separate, explicit step that completes before the build system even runs.
  • IDE integration: developers had to manually configure run configurations for CLion, VS Code, or Visual Studio, and those configurations broke whenever the application changed, build options were added, or targets were renamed.
  • Platform-specific toolchain setup: on macOS with Homebrew Clang, the standard tool paths (llvm-ar, llvm-ranlib, ld.lld) are not where the system expects them. On Windows, MSVC requires a Developer Command Prompt with specific environment variables. Setting up either of these correctly from scratch is its own project.
  • Debugger integration: there was no automated way to set up LLDB formatters or GDB pretty printers for Clang and MrDocs symbols. Developers working on the AST had to inspect raw memory layouts.
  • The sheer volume of instructions: the build script should not assume a package manager, so you end up documenting both the manual and the package manager path. For each dependency, for each variant (sanitizers, special build types), for each platform. When the package manager path does not work for a given configuration, the developer falls back to the manual path, and that path has to be maintained too.

Five Workflows and Counting

The proliferation was gradual. We started with manual CMake commands, then added FetchContent as an alternative, then adopted vcpkg, then had to support both vcpkg modes, then needed custom CI scripts. By mid-2025, we had accumulated five different workflows for installing dependencies:

  1. Manual CMake: the original path, configuring everything by hand
  2. FetchContent: later removed after the internet incident
  3. vcpkg (manifest mode): the “official” package manager path
  4. vcpkg (classic mode): because some users did not like manifest mode
  5. Custom CI scripts: CI uses its own language to describe workflows, and there was no single command that could configure all possible build variants

%%{init: {"theme": "base", "themeVariables": {"primaryColor": "#fce4e4", "primaryBorderColor": "#e8a0a0", "primaryTextColor": "#1f2a44", "lineColor": "#e8a0a0", "secondaryColor": "#fef3e4", "tertiaryColor": "#ffffff", "fontSize": "14px"}}}%%
flowchart LR
    A[New Developer] --> B{Which workflow?}
    B --> C[Manual CMake]
    B --> D[FetchContent]
    B --> E[vcpkg manifest]
    B --> F[vcpkg classic]
    B --> G[CI scripts]

We tried to create a set of instructions that would describe what the user could do for each dependency. For each dependency, we would explain each of the ways to fetch and build it: manual, vcpkg manifest, vcpkg classic. On top of that, for each special variant (sanitizer builds, special build type combinations), there would be yet another set of instructions per dependency per workflow. The documentation grew combinatorially, and people got lost.

The Bootstrap Script

The core principle was separation of concerns: CMake builds the project, but something else manages the dependencies. The bootstrap script fills that gap.

Before:

# Clone and build LLVM (specific commit)
git clone https://github.com/llvm/llvm-project.git
cd llvm-project && git checkout dc4cef81d47c...
cmake -S llvm -B build -DCMAKE_BUILD_TYPE=Release ...
cmake --build build
cmake --install build
cd ..

# Download and build Duktape
curl -L https://github.com/.../duktape-2.7.0.tar.xz | tar xJ
cmake -S duktape -B duktape/build ...
cmake --build duktape/build
cmake --install duktape/build

# Repeat for libxml2, Lua...
# Then configure MrDocs with all the install paths
cmake -S mrdocs -B mrdocs/build \
  -DLLVM_ROOT=/path/to/llvm/install \
  -Dduktape_ROOT=/path/to/duktape/install \
  -Dlibxml2_ROOT=/path/to/libxml2/install \
  ...
cmake --build mrdocs/build

After:

python bootstrap.py

The script handles everything else:

  1. Probes MSVC (Windows only): detects and imports the Visual Studio development environment
  2. Checks system prerequisites: validates that cmake, git, python, and a C/C++ compiler are available
  3. Sets up compilers: resolves compiler paths, detects Homebrew Clang on macOS
  4. Configures build options: prompts for build type, sanitizer, and preset name (or accepts defaults in non-interactive mode for CI)
  5. Probes compilers: runs a dummy CMake project to extract the compiler ID, version, and capabilities before building anything
  6. Sets up Ninja: finds or downloads the Ninja build system
  7. Installs dependencies: fetches and builds Duktape, Lua, libxml2, and LLVM in topological order, each with the correct flags for the chosen configuration
  8. Generates CMake presets: writes a CMakeUserPresets.json with all dependency paths, compiler configuration, and IDE settings
  9. Generates IDE configurations: run/debug configs for CLion, VS Code, and Visual Studio, plus debugger pretty printers
  10. Builds MrDocs: configures, builds, and optionally installs MrDocs using the generated presets
  11. Runs tests: executes the test suite in parallel

%%{init: {"theme": "base", "themeVariables": {"primaryColor": "#e4eee8", "primaryBorderColor": "#affbd6", "primaryTextColor": "#000000", "lineColor": "#baf9d9", "secondaryColor": "#f0eae4", "tertiaryColor": "#ebeaf4", "fontSize": "14px"}}}%%
sequenceDiagram
    participant U as Developer
    participant B as bootstrap.py
    participant S as System
    participant D as Dependencies
    participant C as CMake
    participant I as IDE
    U->>B: python bootstrap.py
    B->>S: Probe MSVC environment (Windows)
    B->>S: Check prerequisites (cmake, git, compiler)
    B->>S: Set up compilers and Ninja
    B->>U: Prompt for build type, sanitizer, preset
    B->>S: Probe compiler ID and version
    B->>D: Fetch and build dependencies
    B->>C: Generate CMakeUserPresets.json
    B->>I: Generate IDE and debugger configs
    B->>C: Build and install MrDocs
    B->>C: Run tests

How It Evolved

The first commit landed on July 16, 2025. Over the next eight months, the script went through seven distinct phases of development across roughly 57 commits.

%%{init: {"theme": "base", "themeVariables": {"primaryColor": "#f7f9ff", "primaryBorderColor": "#9aa7e8", "primaryTextColor": "#1f2a44", "lineColor": "#b4bef2", "secondaryColor": "#fbf8ff", "tertiaryColor": "#ffffff", "fontSize": "14px"}}}%%
timeline
    title bootstrap.py Evolution
    Jul 2025 : Foundation and UX
    Aug 2025 : IDE configs, sanitizers, and Windows
    Sep 2025 : Developer tooling and LLDB
    Dec 2025 : Modularization into package
    Mar 2026 : CI integration

The first week (July 16–19) was about getting the one-liner to work at all: the core workflow, colored prompts, parallel test execution, and the first installation docs.

Phase 1: Foundation (July 16–19, 2025)
  • 521cc704 build: bootstrap script
  • e32bb36e build: bootstrap uses another path for mrdocs source when not already called from source directory
  • e7e3ef51 build: bootstrap build options list valid types
  • 75c28e45 build: bootstrap prompts use colors
  • c156a05f build: bootstrap removes redundant flags
  • c14f071b build: bootstrap runs tests in parallel
  • 1a9de28c docs: one-liner installation instructions
  • 76611f93 build: bootstrap paths use cmake relative path shortcuts

The second and third weeks turned the script into a development environment setup tool by generating IDE run configurations for CLion, VS Code, and Visual Studio. By the end of July, the script also supported custom compilers, sanitizer builds, and Homebrew Clang on macOS.

Phase 2: IDE Integration (July 22–28, 2025)
  • 502cfbd8 build: bootstrap generates debug configurations
  • b546c260 build: bootstrap dependency refresh run configurations
  • 83525d38 build: bootstrap documentation run configurations
  • 2cfdd19e build: bootstrap website run configurations
  • ca4b04d3 build: bootstrap MrDocs self-reference run configuration
  • b5f53bd9 build: bootstrap XML lint run configurations
Phase 3: Build Variants and Sanitizers (July 29–August 1, 2025)
  • 0a751acd build: bootstrap supports custom compilers
  • ff62919f build: LLVM runtimes come from presets
  • 2b757fac build: bootstrap debug presets with release dependencies
  • 0d179e84 build: installation workflow uses Ninja for all projects
  • 3d8fa853 build: installation workflow supports sanitizers
  • 26cec9d8 build: installation workflow supports homebrew clang

August was the cross-platform month. Windows support required probing vcvarsall.bat, handling Visual Studio tool paths, and ensuring git symlinks worked. Paths were made relocatable so CMakeUserPresets.json files could be shared across machines.

Phase 4: Cross-Platform Polish (August 2025)
  • fc2aa2d6 build: external include directories are relocatable
  • 21c206b9 build: bootstrap vscode run configurations
  • d2f9c204 build: Visual Studio run configurations
  • 0ca523e7 build: bootstrap supports default Visual Studio tool paths on Windows
  • 4b79ef41 build(bootstrap): probe vcvarsall environment
  • 4d705c96 build(bootstrap): ensure git symlinks
  • 524e7923 build(bootstrap): visual studio run configurations and tasks
  • 94a5b799 build(bootstrap): remove dependency build directories after installation

September and October added developer tooling: LLDB data formatters for Clang and MrDocs symbols, pretty printer configurations, libcxx hardening mode, and the style guide documentation.

Phase 5: Developer Tooling (September–October 2025)
  • fc98559a build(bootstrap): include pretty printers configuration
  • 069bd8f4 feat(lldb): LLDB data formatters
  • 1b39fdd7 fix(lldb): clang ast formatters
  • 988e9ebc build(bootstrap): config info for docs
  • f48bbd2f build: bootstrap enables libcxx hardening mode
  • 5e16e3fa Fix support for clang cl-mode driver (#1069)

By December, the monolithic 2,700-line bootstrap.py was refactored into a proper Python package under util/bootstrap/ with 20+ modules organized by concern: core/ (platform detection, options, UI), configs/ (IDE run configurations), presets/ (CMake preset generation), recipes/ (dependency building), and tools/ (compiler detection). The package also includes its own test suite, which means one person changing the bootstrap script for their platform is not going to break it for someone else on a different platform.

Phase 6: Modularization (November–December 2025)
  • 0d4a8459 build(bootstrap): modularize recipes
  • 7ba4699b build(bootstrap): transition banner
  • 99d61207 build(bootstrap): handle empty input and “none” in prompt retry
  • e3b3fd02 build(bootstrap): convert script into package structure

In March 2026, the bootstrap script replaced the custom CI dependency scripts. This was a major milestone: users, developers, and CI now all use the same tool. CI was simplified significantly because the dependency steps are no longer custom shell commands maintained separately. And because CI runs the bootstrap on every push, the script itself is continuously tested across all platforms. If the bootstrap breaks on any platform, CI catches it immediately.

Phase 7: CI Integration (2026)
  • 6cee4af2 use system libs by default (#1077)
  • 9b4fafbf ci: dependency steps use bootstrap script

Key Design Decisions

Several technical challenges required careful design. Here are the most interesting ones.

Flag propagation. Not all flags should reach all dependencies, and the propagation rules vary per flag type and per dependency. Some sanitizers require all dependencies to be instrumented, while others only need compile-time checks. Build type does not always propagate (libxml2 is always built as Release). Compiler paths always propagate. The script evaluates each dependency individually and checks ABI compatibility before deciding whether to honor or coerce the build type.
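A hypothetical sketch of how such per-dependency decisions can be modeled. The rule table below is illustrative, not the actual bootstrap code; the dependency names match the post, but the rules are simplified:

```python
# Illustrative propagation rules: which project settings reach which dependency.
RULES = {
    "compiler": lambda dep, value: value,  # compiler paths always propagate
    # UBSan needs only compile-time checks, so unsanitized dependency
    # builds can be reused; other sanitizers must propagate.
    "sanitizer": lambda dep, value: None if value == "UBSan" else value,
    # Build type propagates selectively: libxml2 is always Release,
    # and LLVM gets an optimized-debug compromise when the project is Debug.
    "build_type": lambda dep, value: (
        "Release" if dep == "libxml2"
        else "OptimizedDebug" if dep == "llvm" and value == "Debug"
        else value
    ),
}

def effective_config(dep, project_config):
    """Compute the settings a dependency is actually built with."""
    out = {}
    for key, value in project_config.items():
        propagated = RULES[key](dep, value)
        if propagated is not None:
            out[key] = propagated
    return out

cfg = {"compiler": "clang++-19", "build_type": "Debug", "sanitizer": "ASan"}
print(effective_config("libxml2", cfg))  # build_type coerced to Release
print(effective_config("llvm", cfg))     # build_type coerced to OptimizedDebug
```

The point of the table is that the answer is a function of both the flag and the dependency, which is exactly what a flat "propagate everything" or "propagate nothing" model cannot express.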

Windows ABI handling. On MSVC, Debug and Release are ABI-incompatible at the CRT level. When the script detects a mismatch, it coerces the dependency build to “OptimizedDebug” (Debug ABI with /O2 optimization). This is different from RelWithDebInfo, which uses the Release ABI with debug symbols and will not link with a Debug MrDocs.
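On MSVC, the coercion amounts to keeping the Debug CRT while raising the optimization level. A sketch of what that looks like as CMake cache variables (flag details simplified; note that MSVC's default Debug flags include /RTC1, which is incompatible with /O2 and must be dropped):

```shell
# OptimizedDebug: Debug CRT (/MDd) so it links with a Debug MrDocs,
# but compiled with /O2 so it is fast enough to use.
cmake -S duktape -B build \
  -DCMAKE_BUILD_TYPE=Debug \
  -DCMAKE_CXX_FLAGS_DEBUG="/MDd /Zi /O2 /Ob1"
```

Contrast with RelWithDebInfo, which keeps the Release CRT (/MD) and only adds /Zi: same debug symbols, wrong side of the ABI boundary.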

Cross-platform compiler detection. On Linux, compiler detection is straightforward. On macOS with Homebrew Clang, the script detects and injects the correct llvm-ar, llvm-ranlib, ld.lld, and libc++ paths, which are not on the default search path. On Windows, the script locates Visual Studio via vswhere.exe, runs vcvarsall.bat with debug output, and parses the environment variables into Python for all subsequent CMake calls.
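The vcvarsall step boils down to running the batch file in a cmd shell, dumping the resulting environment with `set`, and parsing it into Python. A simplified sketch (error handling and the vswhere.exe lookup omitted; the commented path is a placeholder):

```python
import subprocess

def parse_env_dump(dump: str) -> dict:
    """Parse the VAR=VALUE lines printed by `set` into a dict."""
    env = {}
    for line in dump.splitlines():
        key, sep, value = line.partition("=")
        if sep:  # skip banner lines and anything without an "="
            env[key] = value
    return env

def import_msvc_env(vcvarsall: str, arch: str = "x64") -> dict:
    """Run vcvarsall.bat, then dump and parse the environment it sets up."""
    # `call` runs the batch file in the same shell; `set` then prints
    # every variable that the script exported.
    out = subprocess.check_output(
        f'cmd /c "call \"{vcvarsall}\" {arch} && set"', shell=True, text=True
    )
    return parse_env_dump(out)

# On Windows, the parsed variables feed every subsequent CMake call:
# os.environ.update(import_msvc_env(r"C:\...\vcvarsall.bat"))
```

Capturing the environment once and reusing it means developers never need to launch a Developer Command Prompt themselves.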

CMake preset generation. After building dependencies, the script generates a CMakeUserPresets.json with all dependency paths, compiler configuration, and platform conditions. Paths are made relocatable by replacing absolute prefixes with CMake variables (${sourceDir}, ${sourceParentDir}, $env{HOME}).
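Relocation can be as simple as longest-prefix substitution over each preset value. An illustrative sketch (the macro names match CMake's preset macros; the rewrite order and paths are assumptions):

```python
def relocate(path: str, source_dir: str, home: str) -> str:
    """Replace absolute prefixes with CMake preset macros, most specific first."""
    parent = source_dir.rsplit("/", 1)[0]
    for prefix, macro in [
        (source_dir, "${sourceDir}"),
        (parent, "${sourceParentDir}"),
        (home, "$env{HOME}"),
    ]:
        # Match whole path components so /x/mrdocs does not match /x/mrdocs-deps.
        if path == prefix or path.startswith(prefix + "/"):
            return macro + path[len(prefix):]
    return path

preset_value = relocate(
    "/home/alice/mrdocs-deps/llvm/install",
    source_dir="/home/alice/mrdocs",
    home="/home/alice",
)
print(preset_value)  # ${sourceParentDir}/mrdocs-deps/llvm/install
```

Because the generated CMakeUserPresets.json contains only macros, the same file works after a checkout moves, or on a different developer's machine.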

IDE run configurations. The script generates ready-to-use configurations for CLion, VS Code, and Visual Studio: building and debugging MrDocs, running tests, generating documentation, refreshing dependencies, generating config info and YAML schemas, validating XML output, running MrDocs on Boost libraries (auto-discovered), and reformatting source files. CMake custom commands can create build targets, but you cannot debug them from the IDE.

Recipe system. Dependencies are defined as JSON recipe files with source URLs, build steps, and dependency relationships. The bootstrap topologically sorts them and builds them in order. Each recipe tracks its state with a stamp file (recipe version, git ref, platform, build parameters). If any parameter changes, the dependency is rebuilt. The stamp system also generates CI cache keys like llvm-abc1234-release-ubuntu-24.04-clang-19-ASan.
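The scheduling reduces to a topological sort over the declared relationships, and the stamp reduces to hashing the build parameters. A condensed sketch (the recipe data is invented; the real recipes also carry source URLs, build steps, and the compiler in the key):

```python
import hashlib
from graphlib import TopologicalSorter

# Hypothetical recipe set: name -> (dependencies, pinned ref)
RECIPES = {
    "llvm":    ((), "dc4cef8"),
    "duktape": ((), "2.7.0"),
    "lua":     ((), "5.4.7"),
    "libxml2": ((), "2.13.5"),
    "mrdocs":  (("llvm", "duktape", "lua", "libxml2"), "HEAD"),
}

def build_order():
    """Topologically sort recipes so every dependency builds first."""
    ts = TopologicalSorter({name: deps for name, (deps, _) in RECIPES.items()})
    return list(ts.static_order())

def stamp_key(name, build_type, platform, sanitizer="none"):
    """Cache key: changing any parameter changes the key, forcing a rebuild."""
    _, ref = RECIPES[name]
    params = f"{name}-{ref}-{build_type}-{platform}-{sanitizer}"
    digest = hashlib.sha256(params.encode()).hexdigest()[:8]
    return f"{params}-{digest}"

print(build_order())  # mrdocs is always last
print(stamp_key("llvm", "release", "ubuntu-24.04", "ASan"))
```

The same key doubles as the CI cache key, so a dependency built once for a given configuration is never rebuilt anywhere.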

Refresh command. Because of the stamp system, a developer can run the bootstrap with --refresh-all at any time. The script re-evaluates all stamps and rebuilds only the dependencies that are out of date with whatever configurations are needed. This makes updating dependencies after a configuration change (new sanitizer, different compiler, updated LLVM commit) a single command rather than a manual process of figuring out which dependencies need rebuilding.

What We Learned

Users, developers, and CI now all use the same tool. Users get a one-liner installation. Developers get IDE run configurations and debugger integration. CI gets non-interactive mode with sanitizer support. The exact same code path that builds dependencies on a developer’s laptop now builds dependencies in CI.

Separation of concerns. When your project’s requirements are complex enough (multiple build types, sanitizer variants, cross-platform quirks, heavy dependencies like LLVM), a custom script that owns the entire dependency lifecycle is simpler than trying to make a general-purpose tool handle every edge case.

Existing tools solve the general case well. Our specific combination of requirements needed something tailored.

C++ has no unified build workflow. Every platform has its own conventions for finding compilers, setting up environments, and linking libraries. Just finding and setting up MSVC from a script is a project in itself.

New contributors can start working immediately. Before the bootstrap, getting a working build could take days. Now it takes a single command, and the IDE configurations are included.

We still have small glitches as new compilers and platforms appear, but each fix is a localized change in one module rather than a cross-cutting update to five independent workflows.

The complete bootstrap package is available in the MrDocs repository.
