Not every computation succeeds. A partial Markov category extends Fritz's Markov categories with a restriction structure: morphisms can fail, observations can have zero probability, and Bayes' theorem becomes a partial operation, defined exactly when the evidence is possible. The framework unifies Pearl's conditioning with Jeffrey's updating.
Restriction — morphisms that can fail
A restriction category (nLab) gives each morphism a "domain of definition," a partial identity marking where the morphism is defined. In probabilistic terms: some observations have zero probability, and conditioning on them is undefined.
Scheme
; A partial function: defined on some inputs, undefined on others
; Restriction: the domain where the function is defined
(define (partial-div x y)
  (if (= y 0) 'undefined (/ x y)))

(define (restriction f)
  ; Returns a predicate: where is f defined?
  (lambda (args)
    (not (eq? (apply f args) 'undefined))))

(define div-defined? (restriction partial-div))

(display "div(6,3) = ") (display (partial-div 6 3)) (newline)
(display "div(6,0) = ") (display (partial-div 6 0)) (newline)
(display "defined at (6,3)? ") (display (div-defined? '(6 3))) (newline)
(display "defined at (6,0)? ") (display (div-defined? '(6 0)))
Python
# Partial functions and restriction
def partial_div(x, y):
    if y == 0:
        return None  # undefined
    return x / y

def restriction(f):
    return lambda *args: f(*args) is not None
div_ok = restriction(partial_div)
print(f"div(6,3) = {partial_div(6,3)}, defined: {div_ok(6,3)}")
print(f"div(6,0) = {partial_div(6,0)}, defined: {div_ok(6,0)}")
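The defining axiom of a restriction category says that precomposing a morphism with its own restriction changes nothing: f ∘ f̄ = f. That law can be checked directly for the partial functions above, if we make the restriction a partial identity rather than a boolean test. A minimal sketch, with helper names (`restriction_of`, `precompose`) that are mine, not from the paper:

```python
# First restriction axiom, f ∘ f̄ = f, for partial functions encoded
# as "return None when undefined".

def partial_div(x, y):
    if y == 0:
        return None  # undefined
    return x / y

def restriction_of(f):
    # f̄: a partial identity, defined exactly where f is defined;
    # it returns its arguments unchanged there, None elsewhere.
    def fbar(*args):
        return args if f(*args) is not None else None
    return fbar

def precompose(f, fbar):
    # f ∘ f̄: undefined wherever f̄ is undefined
    def g(*args):
        dom = fbar(*args)
        return None if dom is None else f(*dom)
    return g

div_bar = restriction_of(partial_div)
div_again = precompose(partial_div, div_bar)

# f ∘ f̄ agrees with f everywhere, including where both are undefined
for args in [(6, 3), (6, 0), (1, 2)]:
    assert div_again(*args) == partial_div(*args)
print("f ∘ f̄ = f verified on sample inputs")
```

Returning the arguments (instead of a boolean, as in `restriction` above) is what lets f̄ compose with f: a partial identity is itself a partial function.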
Observations — conditioning on events
An observation is a partial morphism that "tests" whether an event occurred. If the event has zero probability, the observation is undefined. You can't condition on something that can't happen. Categorically, this is "divide by zero in Bayes' rule."
Scheme
; Observation: condition a distribution on an event
; If P(event) = 0, conditioning is undefined
(define (condition dist event)
  ; dist: alist of (value . probability)
  ; event: predicate on values
  (let* ((matching (filter (lambda (p) (event (car p))) dist))
         (total (apply + (map cdr matching))))
    (if (= total 0)
        'undefined ; can't condition on impossible event
        (map (lambda (p)
               (cons (car p) (/ (cdr p) total)))
             matching))))
(define dist '((1 . 0.3) (2 . 0.5) (3 . 0.2)))
(display "P(even): ")
(display (condition dist even?)) (newline)
(display "P(> 10): ")
(display (condition dist (lambda (x) (> x 10)))) (newline)
; Conditioning on impossible event is undefined
Python
# Conditioning on events
def condition(dist, event):
    matching = {k: v for k, v in dist.items() if event(k)}
    total = sum(matching.values())
    if total == 0:
        return None  # undefined — zero-probability event
    return {k: v / total for k, v in matching.items()}
dist = {1: 0.3, 2: 0.5, 3: 0.2}
print(f"P(even): {condition(dist, lambda x: x % 2 == 0)}")
print(f"P(>10): {condition(dist, lambda x: x > 10)}")
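Observations compose as partial morphisms: conditioning on one event and then another gives the same result as conditioning once on their conjunction, whenever each stage is defined. A quick check, reusing the `condition` function from above (the example distribution is mine):

```python
def condition(dist, event):
    matching = {k: v for k, v in dist.items() if event(k)}
    total = sum(matching.values())
    if total == 0:
        return None  # undefined — zero-probability event
    return {k: v / total for k, v in matching.items()}

dist = {1: 0.2, 2: 0.3, 3: 0.1, 4: 0.4}
is_even = lambda x: x % 2 == 0
at_most_2 = lambda x: x <= 2

# Condition on "even", then on "at most 2" ...
step = condition(dist, is_even)
sequential = condition(step, at_most_2) if step is not None else None

# ... equals conditioning once on the conjunction.
joint = condition(dist, lambda x: is_even(x) and at_most_2(x))
assert sequential == joint
print("sequential:", sequential, " joint:", joint)
```

If either stage hits a zero-probability event, both sides are undefined (`None`), so the equation holds in the partial sense as well.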
Bayes' theorem in the partial setting
In the partial setting, Bayes' theorem says the posterior exists exactly when the marginal likelihood is nonzero. The restriction structure tracks this automatically — no ad hoc division-by-zero guards needed.
Scheme
; Bayes' theorem: partial when marginal likelihood = 0
(define (bayes prior likelihood evidence)
  (let* ((joint (map (lambda (hp)
                       (let ((h (car hp)) (ph (cdr hp)))
                         (let ((pe (assoc evidence (likelihood h))))
                           (if pe
                               (cons h (* ph (cdr pe)))
                               (cons h 0)))))
                     prior))
         (total (apply + (map cdr joint))))
    (if (= total 0)
        'undefined ; restriction: evidence has zero probability
        (map (lambda (jp)
               (cons (car jp) (/ (cdr jp) total)))
             joint))))
(define prior '((A . 0.4) (B . 0.6)))
(define (likelihood h)
(if (eq? h 'A) '((yes . 0.8) (no . 0.2))
'((yes . 0.3) (no . 0.7))))
(display "P(H|yes): ") (display (bayes prior likelihood 'yes)) (newline)
(display "P(H|no): ") (display (bayes prior likelihood 'no)) (newline)
; Impossible evidence
(define (likelihood2 h) '((yes . 0.0) (no . 1.0)))
(display "P(H|yes) with zero likelihood: ")
(display (bayes prior likelihood2 'yes))
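Python

The earlier sections give both languages; for parity, here is the same construction in Python, a sketch using dicts for distributions (names are mine):

```python
# Bayes' rule as a partial morphism: None when the evidence
# has zero marginal probability.

def bayes(prior, likelihood, evidence):
    joint = {h: p * likelihood(h).get(evidence, 0.0) for h, p in prior.items()}
    total = sum(joint.values())
    if total == 0:
        return None  # restriction: evidence is impossible
    return {h: p / total for h, p in joint.items()}

prior = {"A": 0.4, "B": 0.6}
def likelihood(h):
    return {"yes": 0.8, "no": 0.2} if h == "A" else {"yes": 0.3, "no": 0.7}

print("P(H|yes):", bayes(prior, likelihood, "yes"))
print("P(H|no): ", bayes(prior, likelihood, "no"))

# Impossible evidence: the posterior is undefined
print("zero likelihood:", bayes(prior, lambda h: {"yes": 0.0, "no": 1.0}, "yes"))
```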
Pearl vs Jeffrey updating
Pearl conditioning sets the evidence to a certainty (a hard observation). Jeffrey updating sets the evidence to a distribution (a soft observation). Both live in the same categorical structure: Jeffrey generalizes Pearl, since a point-mass soft observation recovers hard conditioning.
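Python

The contrast can be made concrete. Pearl conditioning renormalizes on one observed outcome; Jeffrey updating mixes the Pearl posteriors, weighted by the soft-evidence distribution. A minimal sketch, assuming the prior and likelihood from the Bayes section above (the helper names and numbers are illustrative, not from the paper):

```python
def pearl(prior, likelihood, evidence):
    # Hard observation: condition on one outcome; None if impossible
    joint = {h: p * likelihood(h).get(evidence, 0.0) for h, p in prior.items()}
    total = sum(joint.values())
    if total == 0:
        return None
    return {h: p / total for h, p in joint.items()}

def jeffrey(prior, likelihood, soft_evidence):
    # Soft observation: mixture of Pearl posteriors, weighted by the
    # new evidence distribution; undefined if any outcome given
    # positive weight is impossible under the prior.
    posterior = {h: 0.0 for h in prior}
    for e, w in soft_evidence.items():
        if w == 0:
            continue
        p_e = pearl(prior, likelihood, e)
        if p_e is None:
            return None  # conditioning on an impossible outcome
        for h in posterior:
            posterior[h] += w * p_e[h]
    return posterior

prior = {"A": 0.4, "B": 0.6}
def likelihood(h):
    return {"yes": 0.8, "no": 0.2} if h == "A" else {"yes": 0.3, "no": 0.7}

print("Pearl, observe yes:  ", pearl(prior, likelihood, "yes"))
# Degenerate soft evidence recovers Pearl exactly
print("Jeffrey, yes w.p. 1: ", jeffrey(prior, likelihood, {"yes": 1.0}))
# Genuinely soft evidence: 70% confident the reading was yes
print("Jeffrey, yes w.p. .7:", jeffrey(prior, likelihood, {"yes": 0.7, "no": 0.3}))
```

With the point-mass evidence `{"yes": 1.0}` Jeffrey agrees with Pearl, which is the generalization claim in the text made computational.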
The examples use finite distributions with explicit zero-probability checks. The paper works with restriction categories and subdistribution monads, where partiality is built into the categorical structure via restriction idempotents. For example, the conditioning function on this page returns 'undefined when the event has zero probability. In the paper, that becomes a restriction idempotent, a morphism e with e;e = e that acts as a partial identity; the zero-probability case is where the restriction is the zero morphism. The division-by-zero guard is the same; the categorical abstraction is not.
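The idempotent law e;e = e can itself be checked in this style: take e to be the partial identity on subdistributions that keeps the probability mass satisfying a predicate and drops the rest, without renormalizing. A sketch (the encoding is mine, not the paper's):

```python
# A restriction idempotent as a partial identity on subdistributions:
# keep the mass where the predicate holds, drop the rest.

def restriction_idempotent(pred):
    def e(subdist):
        return {k: v for k, v in subdist.items() if pred(k)}
    return e

def compose(e1, e2):
    return lambda d: e2(e1(d))  # diagrammatic order: e1 ; e2

e = restriction_idempotent(lambda x: x % 2 == 0)
dist = {1: 0.3, 2: 0.5, 3: 0.2}

# e ; e = e: applying the partial identity twice is the same as once
assert compose(e, e)(dist) == e(dist)
print("e;e = e on", dist, "->", e(dist))
```

Note that the output has total mass 0.5, not 1: it is a subdistribution, which is exactly where the subdistribution monad mentioned above comes in.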
Every example here is simplified.
Ready for the real thing? Read the paper. Start at §3 for partial Markov categories, §5 for observations and Bayes' theorem, §7 for Pearl and Jeffrey updates.