A governance system that produces different outputs for the same inputs is not a governance system. It is a probabilistic approximation of one. This distinction matters. Approximations are useful for many things. They are not useful for governance. Governance requires the ability to reconstruct, verify, and independently confirm that a specific decision was made by specific rules from specific inputs - not approximately, not statistically, but exactly. Determinism is not a feature of governance infrastructure. It is the definition.
What Determinism Means
Determinism in a governance system has a precise meaning: same inputs -> same canonical bytes -> same fingerprint. The execution fingerprint - SHA-256(canonicalize(signals)) - is a function of the input signals only. Given the same signals, the fingerprint is the same on any machine, in any environment, at any time. Not approximately the same. Bit-for-bit identical.
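The fingerprint computation can be sketched as follows. This is a minimal illustration, not the SDK's implementation: the canonicalize stand-in here only sorts keys, while the real canonical form (described later in this document) also covers Unicode and line-ending normalization.

```typescript
import { createHash } from "node:crypto";

// Minimal stand-in for the canonical serializer: recursively sorts
// object keys so the same logical signals always yield the same bytes.
function canonicalize(value: unknown): string {
  if (Array.isArray(value)) {
    return "[" + value.map(canonicalize).join(",") + "]";
  }
  if (value !== null && typeof value === "object") {
    const obj = value as Record<string, unknown>;
    return "{" + Object.keys(obj).sort()
      .map((k) => JSON.stringify(k) + ":" + canonicalize(obj[k]))
      .join(",") + "}";
  }
  return JSON.stringify(value);
}

// Execution fingerprint: SHA-256 over the canonical bytes of the signals.
function executionFingerprint(signals: object): string {
  return createHash("sha256").update(canonicalize(signals), "utf8").digest("hex");
}

// Key order in the input object does not affect the fingerprint.
const a = executionFingerprint({ actor: "svc-1", amount: 100 });
const b = executionFingerprint({ amount: 100, actor: "svc-1" });
```

Because the fingerprint depends only on the canonical bytes, the two calls above produce the same hex digest on any machine.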
Same fingerprint -> same decision (given same policy).
Policy evaluation is a deterministic function: a set of ordered rules evaluated against a set of signals. There is no randomness, no model variance, no context window. Given the same signals and the same policy version, the evaluation always produces the same decision. No exceptions.
Same decision -> same signature (given same key and algorithm).
Ed25519 is a deterministic signing algorithm. Unlike ECDSA (which requires a random nonce per signature and is vulnerable to nonce reuse), Ed25519 signatures are computed deterministically from the private key and the message. The same private key signing the same payload always produces the same signature. This is a cryptographic property of the algorithm, not an implementation choice.
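The determinism of Ed25519 can be observed directly with Node's built-in crypto module (used here purely as an illustration of the algorithm's property):

```typescript
import { generateKeyPairSync, sign } from "node:crypto";

// Generate an Ed25519 key pair.
const { privateKey } = generateKeyPairSync("ed25519");

// A fixed payload: the canonical bytes that would be signed.
const payload = Buffer.from('{"decision":"allow","execution_fingerprint":"abc"}', "utf8");

// Ed25519 takes no separate digest algorithm, hence the null.
const sig1 = sign(null, payload, privateKey);
const sig2 = sign(null, payload, privateKey);

// RFC 8032 Ed25519 is deterministic: both signatures are byte-identical.
const identical = sig1.equals(sig2);
```

Repeating the same two calls with an ECDSA key would yield two different signatures, because each ECDSA signature consumes fresh randomness.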
Result: any party can reproduce the entire chain from inputs.
Given the signals, the policy, the public key, and the attestation, any party can:
- Compute the canonical form of the signals
- Compute the fingerprint
- Evaluate the policy
- Construct the canonical payload
- Verify the Ed25519 signature
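Under the assumption of a minimal attestation shape (the field names below are illustrative, not the SDK's actual schema), the chain above can be sketched end to end as a pure function of the attestation and the public key:

```typescript
import { createHash, generateKeyPairSync, sign, verify, KeyObject } from "node:crypto";

// Stand-in canonical serializer (sorted keys only); a real verifier would
// use the SDK's single authoritative canonicalize().
function canonicalize(value: unknown): string {
  if (Array.isArray(value)) return "[" + value.map(canonicalize).join(",") + "]";
  if (value !== null && typeof value === "object") {
    const obj = value as Record<string, unknown>;
    return "{" + Object.keys(obj).sort()
      .map((k) => JSON.stringify(k) + ":" + canonicalize(obj[k])).join(",") + "}";
  }
  return JSON.stringify(value);
}

// Hypothetical attestation shape, for illustration only.
interface Attestation {
  signals: object;   // the governance inputs
  decision: string;  // the recorded decision
  signature: Buffer; // Ed25519 over the canonical payload
}

// Recompute fingerprint -> reconstruct canonical payload -> check signature.
function verifyAttestation(att: Attestation, publicKey: KeyObject): boolean {
  const fingerprint = createHash("sha256")
    .update(canonicalize(att.signals), "utf8").digest("hex");
  const payload = canonicalize({ decision: att.decision, execution_fingerprint: fingerprint });
  return verify(null, Buffer.from(payload, "utf8"), publicKey, att.signature);
}

// Demo: produce an attestation, then independently re-verify it.
const { publicKey, privateKey } = generateKeyPairSync("ed25519");
const signals = { actor: "svc-1", amount: 100 };
const fp = createHash("sha256").update(canonicalize(signals), "utf8").digest("hex");
const payloadBytes = Buffer.from(
  canonicalize({ decision: "allow", execution_fingerprint: fp }), "utf8");
const attestation: Attestation = {
  signals, decision: "allow", signature: sign(null, payloadBytes, privateKey),
};
const verified = verifyAttestation(attestation, publicKey);
```

Note that verifyAttestation touches no runtime state: everything it needs is in the attestation and the key.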
Why Non-Determinism Breaks Governance
The Verification Problem
Consider a decision signed today. Can you verify it in five years? Only if the signing inputs are reproducible. If anything non-deterministic entered the signed payload - a wall-clock timestamp, a random identifier, a non-normalized string that normalizes differently on different platforms - then the canonical payload computed today differs from the canonical payload computed at signing time. The signature fails to verify. The governance record is unverifiable. This is not a theoretical concern. It is the primary failure mode of naive attestation implementations.
Wall-clock timestamps. Including new Date().toISOString() in a signed payload makes the payload non-reproducible. The timestamp changes every millisecond. The attestation signed at time T cannot be reconstructed at time T+1 because the timestamp has changed. The signature is unverifiable immediately after signing.
The correct approach: exclude timestamps from signed content. The approved_at field in a Parmana attestation is metadata - recorded for observability, but excluded from the canonical payload that the signature covers. The signed content contains only deterministic fields: the execution fingerprint, policy reference, decision, and runtime hash.
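A record shape along these lines separates the two concerns. The field names are assumptions for illustration, not the SDK's schema; the point is that the signature covers only the deterministic fields:

```typescript
import { randomUUID } from "node:crypto";

// Hypothetical record shape. Only `signed` is covered by the signature.
interface SignedContent {
  execution_fingerprint: string; // SHA-256 of the canonical signals
  policy_ref: string;            // policy identity and version
  decision: string;
  runtime_hash: string;
}

interface AttestationRecord {
  signed: SignedContent; // the canonical form of this object is what gets signed
  executionId: string;   // random event id: metadata, NOT signed
  approved_at: string;   // wall-clock timestamp: metadata, NOT signed
  signature: string;     // base64 Ed25519 signature over the signed content
}

// The bytes the signature covers are a function of `signed` alone, so a
// verifier never needs executionId or approved_at to reconstruct them.
function signedBytes(record: AttestationRecord): string {
  return JSON.stringify(record.signed, Object.keys(record.signed).sort());
}

const record: AttestationRecord = {
  signed: {
    execution_fingerprint: "fingerprint-placeholder",
    policy_ref: "payments-policy@v7", // hypothetical policy reference
    decision: "allow",
    runtime_hash: "runtime-placeholder",
  },
  executionId: randomUUID(),             // differs on every run
  approved_at: new Date().toISOString(), // differs on every millisecond
  signature: "signature-placeholder",
};
```

Changing or losing approved_at after the fact cannot break verification, because it was never part of the signed bytes.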
Non-canonical JSON. JSON key ordering is implementation-defined. {"b":1,"a":2} and {"a":2,"b":1} are the same object by JSON semantics, but they are different byte sequences. Different byte sequences produce different SHA-256 hashes and different Ed25519 signatures.
If a signing implementation serializes JSON without enforcing key order, the same object can produce different signatures on different platforms, or even on different runs of the same platform if the runtime's hash table iteration order is randomized (a common security hardening measure). The signature is unreliable.
The correct approach: canonical JSON with sorted keys, computed by a single authoritative function. One implementation. No alternatives.
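The effect of key order on the hash can be demonstrated directly (a small sketch; a real canonicalizer would also recurse into nested values):

```typescript
import { createHash } from "node:crypto";

const sha256 = (s: string) => createHash("sha256").update(s, "utf8").digest("hex");

// Same object by JSON semantics, different byte sequences:
const naiveA = '{"b":1,"a":2}';
const naiveB = '{"a":2,"b":1}';
const naiveHashesMatch = sha256(naiveA) === sha256(naiveB); // false: different bytes

// Sorting keys before serializing restores byte equality (flat objects only).
function sortedStringify(obj: Record<string, unknown>): string {
  return "{" + Object.keys(obj).sort()
    .map((k) => JSON.stringify(k) + ":" + JSON.stringify(obj[k]))
    .join(",") + "}";
}

const canonicalHashesMatch =
  sha256(sortedStringify({ b: 1, a: 2 })) === sha256(sortedStringify({ a: 2, b: 1 }));
```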
Unicode normalization. The string “café” can be represented with a single precomposed character (U+00E9) or with “e” followed by a combining acute accent (U+0301). They look identical. They are different byte sequences. Different byte sequences produce different fingerprints.
If signals are not normalized before hashing, the same semantic value produces different fingerprints depending on which Unicode normalization form was used by the producer. A signal produced on macOS (which normalizes to NFD) has a different fingerprint than the same signal on Linux (which may use NFC or no normalization). Governance is platform-dependent. Verification fails across systems.
The correct approach: NFC normalization of all string values before canonicalization. Single normalization form. Applied before hashing, not after.
Line ending normalization. Windows uses CRLF (\r\n). Unix uses LF (\n). The same text file produced on Windows has different bytes than on Unix. If signal values contain multiline strings - policy text, reasons, structured content - the line endings affect the fingerprint.
The correct approach: normalize all string values to LF before canonicalization. Applied before hashing. Consistent regardless of origin platform.
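The normalization itself is a one-line transform, applied to every string value before hashing (sketched here with hypothetical multiline signal values):

```typescript
// Normalize CRLF sequences to LF before canonicalization, so the same
// multiline value hashes identically regardless of origin platform.
const toLF = (s: string) => s.replace(/\r\n/g, "\n");

const producedOnWindows = "reason: limit exceeded\r\nseverity: high\r\n";
const producedOnUnix = "reason: limit exceeded\nseverity: high\n";

const sameAfterNormalization = toLF(producedOnWindows) === toLF(producedOnUnix);
```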
The Audit Problem
An auditor - a regulator, a compliance officer, a counterparty - wants to reconstruct: “Given signals X and policy version P, what decision should have been made?”
If the runtime is deterministic, this reconstruction is possible. The auditor runs the policy evaluator with the same signals and the same policy version, produces the same canonical payload, and verifies the signature. The result is either: the decision matches the attestation (governance was correctly applied) or it does not (something went wrong).
If the runtime is non-deterministic - if the decision could be different for the same inputs in different environments - then reconstruction fails. The auditor cannot determine what the “correct” decision would have been. The governance claim cannot be verified. The audit is impossible.
This is the fundamental audit problem with probabilistic systems. A model that produces a decision with 73% confidence cannot be audited in this way. You cannot independently verify that the same inputs would have produced the same decision. The model is an oracle, not a function. Governance requires a function.
The Portability Problem
A decision made in a datacenter in Frankfurt must be verifiable by a regulator in Washington. A decision made in 2024 must be verifiable in 2034. A decision made by a system that has since been decommissioned must be verifiable by a court.
Portability requires that verification can be performed by any party, in any environment, at any time, without access to the original system. This is only possible if the verification is a pure function of the attestation and the public key - with no dependencies on runtime state, database contents, or infrastructure availability.
Determinism is the prerequisite for portability. If the signed payload is non-deterministic, the verification function cannot reconstruct it from the attestation fields. If the policy evaluation is non-deterministic, an auditor cannot confirm that the decision was correctly derived from the inputs. If the signature algorithm is non-deterministic (like ECDSA with a random nonce), the signature may not be reproducible even with the correct inputs. Every non-deterministic element in the governance pipeline is an obstacle to portability. Determinism eliminates the obstacles.
The Enemies of Determinism
Determinism has precise enemies. Each one is a specific, identifiable problem with a specific fix.
Wall-Clock Time in Signed Payload
Problem. new Date(), Date.now(), performance.now() - any time-dependent value in the signed payload makes the payload non-reproducible.
Fix. Exclude timestamps from signed content. Record them as metadata. The canonical payload contains only content that is a deterministic function of the governance inputs.
Non-Canonical JSON
Problem. JSON.stringify key order is implementation-defined. Object property insertion order may differ. Hash table iteration order may differ. The same object may serialize differently on different runs or platforms.
Fix. Enforce sorted keys in the canonicalization function. One function, one specification, one implementation. No bypass.
Unicode Non-Normalization
Problem. String equality in Unicode is not byte equality. The same semantic string may have multiple binary representations. Platform-specific normalization means the same string produces different bytes on different systems.
Fix. Apply NFC normalization to all string values before canonicalization. NFC preserves semantic content while ensuring byte-level reproducibility.
Line Ending Variance
Problem. Windows CRLF and Unix LF are different bytes. The same content, produced on different platforms, has different byte representations.
Fix. Normalize all string values to LF before canonicalization. This applies to multiline strings in any field position.
Non-Deterministic Identifiers in Signed Content
Problem. UUIDv4 is a random identifier. If a UUID is included in the signed payload, the same execution has a different UUID every time, making the payload non-reproducible.
Fix. Distinguish between the event identity (executionId - a random UUID, excluded from the signed payload) and the decision identity (execution_fingerprint - the SHA-256 of the canonical signals, included in the signed payload). Include only deterministic identifiers in signed content.
Non-Deterministic Signing Algorithms
Problem. ECDSA requires a random nonce per signature. The same payload signed twice with ECDSA produces two different signatures. The signature is non-deterministic. Nonce reuse in ECDSA is a critical vulnerability (it leaks the private key). Non-determinism in the signature itself complicates verification and audit.
Fix. Use Ed25519. Ed25519 is deterministic by construction - the nonce is derived from the private key and the message using a pseudorandom function, not from external randomness. Same key, same payload, same signature, every time.
How Parmana Enforces Determinism
Single Canonical Authority
The @parmanasystems/bundle package provides a single canonicalize() function. Every signing path in the Parmana SDK delegates to this function. There are no parallel implementations. There is no “fast path” that skips canonicalization. The canonical form is specified exactly, implemented once, and tested extensively.
The Canonical Form Specification
The canonical form is not implicit - it is a specified contract:
- Key ordering - Object keys sorted lexicographically by UTF-8 byte value, applied recursively to all nested objects
- Compact encoding - No whitespace between tokens (no spaces after : or ,)
- NFC normalization - All string values normalized to Unicode NFC form before serialization
- CRLF normalization - All string values with CRLF sequences normalized to LF before serialization
- UTF-8 encoding - Output is UTF-8 encoded
- No floating-point ambiguity - Signal values that are integers are serialized as integers, not floats
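Taken together, the rules above can be sketched as a single function. This is an illustrative reimplementation, not the @parmanasystems/bundle source:

```typescript
import { createHash } from "node:crypto";

// Illustrative sketch of the specified canonical form.
function canonicalize(value: unknown): string {
  if (typeof value === "string") {
    // NFC normalization and CRLF -> LF, applied before serialization.
    return JSON.stringify(value.normalize("NFC").replace(/\r\n/g, "\n"));
  }
  if (Array.isArray(value)) {
    return "[" + value.map(canonicalize).join(",") + "]";
  }
  if (value !== null && typeof value === "object") {
    const obj = value as Record<string, unknown>;
    // Keys sorted and recursed; compact encoding (no whitespace between tokens).
    // Note: JS sort() compares UTF-16 code units; a faithful implementation
    // would compare UTF-8 bytes, which can differ for keys above the BMP.
    return "{" + Object.keys(obj).sort()
      .map((k) => canonicalize(k) + ":" + canonicalize(obj[k]))
      .join(",") + "}";
  }
  // null, booleans, and numbers; JSON.stringify emits integers without a
  // fractional part, matching the no-floating-point-ambiguity rule.
  return JSON.stringify(value);
}

// Fingerprint over the canonical bytes, as described earlier.
const fingerprint = (signals: object) =>
  createHash("sha256").update(canonicalize(signals), "utf8").digest("hex");
```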
Conformance Tests
The correctness of the canonical form is tested with:
- Reproducibility tests - Same inputs produce identical bytes across multiple runs and across restarts
- Cross-platform tests - Same inputs on Windows and Linux produce the same fingerprint
- Unicode normalization tests - Strings in NFC, NFD, NFKC, NFKD forms all produce the same canonical bytes
- CRLF normalization tests - Strings with CRLF produce the same canonical bytes as equivalent LF strings
- Key ordering tests - Objects with different insertion orders produce the same canonical bytes
- Nested object tests - Recursively nested objects are sorted at every level
The Result
A governance system built on deterministic canonical form has the following properties:
A decision signed today is verifiable in fifty years. The canonical form is stable. Ed25519 is a stable algorithm. No infrastructure dependency.
A decision made on Windows is verifiable on Linux. The canonical form normalizes line endings and Unicode. Platform variance is eliminated before hashing.
A decision made in London is verifiable in Singapore. Verification requires only the attestation and the public key. No geographic infrastructure dependency.
A decision made by a decommissioned system is verifiable after the system no longer exists. Self-contained attestations. No runtime required for verification.
A decision contested in court can be independently verified by a technical expert. The verification is a pure function with a published specification. Any competent cryptographer can verify it.
This is what governance means. Not a dashboard. Not a log. A cryptographic proof that a specific decision was made by specific rules from specific inputs, verifiable by any party, at any time, without access to the infrastructure that produced it. Determinism is not a feature. It is the definition.
See Also
- Canonical Serialization - the canonical form specification in detail
- Immutable Lineage - how determinism creates an immutable audit chain
- Portable Verification - the verification model
- Trust Portability - properties required for portable trust
- Replay Protection Deep Dive - determinism in the context of exactly-once execution