Provenance Boundary Failure — Prompt Assembly Diagram

Overview

This reference page documents a provenance boundary failure in prompt assembly: a failure mode where untrusted input is misclassified and handled as if it were authoritative policy. The diagram focuses on the orchestration layer that builds context for an LLM and on how misclassification can propagate into behavior and cross-turn state.

Diagram

Provenance Boundary Failure — Prompt Assembly

System boundary modeled

The diagram models an orchestration boundary (gateway + context builder) that receives:

- trusted policy inputs (system and developer instructions, configuration), and
- untrusted content (user messages, retrieved documents, tool outputs),

and merges them into a single model context.

Components

- Gateway / context builder (the orchestration boundary)
- Session / memory store (cross-turn state)
- LLM (consumes the assembled context)
- Assistant behavior / output (may write state back to session/memory)

Trust and provenance model

This diagram assumes two distinct classes of inputs:

- trusted: system and developer policy, which is authoritative for the model's behavior
- untrusted: user input, retrieved content, and tool outputs, which is data and never policy

A robust prompt-assembly pipeline must preserve this distinction through explicit provenance metadata attached to every context item (source + trust).
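As a minimal sketch of that requirement (the type names and fields are illustrative, not taken from the diagram), provenance can be carried as explicit metadata on each context item and preserved through assembly:

```python
from dataclasses import dataclass
from enum import Enum

class Trust(Enum):
    TRUSTED = "trusted"      # system/developer policy
    UNTRUSTED = "untrusted"  # user input, retrieved docs, tool output

@dataclass(frozen=True)
class ContextItem:
    text: str
    source: str  # e.g. "system", "user", "retrieval", "tool"
    trust: Trust

def assemble(items: list[ContextItem]) -> list[dict]:
    # Provenance is preserved end to end: every entry handed to the
    # model layer still carries its source and trust classification.
    return [{"content": i.text,
             "source": i.source,
             "trust": i.trust.value} for i in items]
```

The point of the frozen dataclass is that an item's provenance cannot be mutated after ingestion; any trust change must be an explicit, auditable re-tagging step.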

Failure mode: provenance boundary failure

A provenance boundary failure occurs when either:

- provenance metadata is never attached, or is dropped during assembly, or
- a context item is misclassified, so that an untrusted source is tagged as trusted.

In both cases, untrusted input is treated as if it were authoritative policy. The diagram marks this as “Provenance break” and shows it propagating into assistant behavior.
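A hedged sketch of the two failure paths, using plain dicts (the item shape here is an assumption of this example, not part of the diagram):

```python
def naive_assemble(items):
    # Provenance break: flattening into one string discards source and
    # trust, so retrieved text that merely *looks like* policy reaches
    # the model indistinguishable from real policy.
    return "\n".join(i["text"] for i in items)

def guarded_assemble(items):
    # Boundary check: a "trusted" tag is only valid for the system
    # source; any other item claiming it is rejected before assembly.
    for i in items:
        if i["trust"] == "trusted" and i["source"] != "system":
            raise ValueError(f"provenance violation from {i['source']}")
    return [(i["trust"], i["text"]) for i in items]
```

The guarded variant fails closed: an inconsistent trust claim aborts assembly rather than silently promoting the item.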

Cross-turn propagation (drift)

The diagram also models a cross-turn risk: assistant behavior may persist state into session/memory, and that state can be reintroduced into subsequent context builds. If a misclassified item is persisted, its elevated influence is reintroduced in later turns, creating cross-turn privilege drift.
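One way to make the drift risk concrete (a sketch; the storage shape is an assumption of this example):

```python
memory = []

def persist_unsafe(text):
    # Drift risk: the trust tag is dropped at write-back, so a later
    # context build must guess the item's status -- and may default
    # to treating remembered text as trusted.
    memory.append({"text": text, "source": "memory"})

def persist_safe(item):
    # Provenance survives the turn boundary: the stored entry keeps
    # the item's trust tag and defaults to untrusted when missing.
    memory.append({"text": item["text"],
                   "source": "memory",
                   "trust": item.get("trust", "untrusted")})
```

The safe variant encodes the key invariant: trust is never upgraded at write-back or reintroduction, only ever preserved or lowered.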

Security impacts shown in the diagram

- Instruction injection: untrusted content (e.g. a retrieved document) is executed as policy rather than quoted as data.
- Cross-turn privilege drift: a misclassified item persisted to session/memory keeps its elevated influence in later turns.

Engineering controls derived from this model

- Attach provenance metadata (source + trust) to every context item at ingestion.
- Validate trust claims at assembly time; reject items whose trust level is inconsistent with their source.
- Render untrusted content as clearly delimited data, never inline with policy.
- Preserve trust tags when persisting to session/memory; never upgrade trust at write-back or reintroduction.
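The delimiting control can be sketched as follows (the wrapper format is hypothetical, not a standard):

```python
def render_untrusted(text: str, source: str) -> str:
    # Control: wrap untrusted content in explicit delimiters so the
    # model can treat it as quoted data rather than as instructions.
    # Escape the closing delimiter so the content cannot break out
    # of the wrapper.
    safe = text.replace("</untrusted>", "<\\/untrusted>")
    return f"<untrusted source={source}>\n{safe}\n</untrusted>"
```

The escaping step matters: without it, untrusted content containing the closing delimiter could terminate the wrapper early and smuggle text outside the quoted region.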

Scope and limitations

This diagram is an author-mapped conceptual model of a prompt-assembly pipeline and its failure mode. It is intended for engineering analysis and control design, not as vendor documentation of internal implementations.

Source files (repository paths)

References (formal identifiers)