The Open Problems

Chapter 4 · Navlakha et al. 2008 · Seely 2025

R-D bounds for journey observables. Checkpoint spacing via sheaf cohomology.

Problem 1: Rate-distortion bounds for journey observables

The question: Given a bit budget R, what is the minimum distortion in journey metrics (reachability, duration, hop count) for a temporal graph?
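To make the observables concrete, here is a minimal sketch (hypothetical, not from the chapter) of how the journey metrics can be computed for one source node. A temporal graph is taken as a list of (u, v, t) contact events, and a journey must traverse edges with strictly increasing timestamps:

```python
# Hypothetical sketch: journey observables from one source node.
# A temporal graph is a list of (u, v, t) contact events; a journey must
# traverse edges with strictly increasing timestamps.

def journey_metrics(events, source, start=0):
    """Earliest-arrival time and hop count (along the earliest-arrival
    journey, not the fewest-hop journey) for every reachable node."""
    arrival = {source: (start, 0)}  # node -> (arrival time, hops)
    # One pass over time-sorted events suffices for earliest arrival,
    # because journey timestamps must increase.
    for u, v, t in sorted(events, key=lambda e: e[2]):
        if u in arrival and arrival[u][0] < t < arrival.get(v, (float("inf"),))[0]:
            arrival[v] = (t, arrival[u][1] + 1)
    return arrival

events = [("a", "b", 1), ("b", "c", 2), ("a", "c", 5), ("c", "d", 3)]
m = journey_metrics(events, "a")
# "d" is reachable via a->b->c->d (times 1, 2, 3); the direct a->c contact
# at t=5 arrives later than the two-hop journey, so it is never used.
print(m)
```

The reachability set is the keys of the returned dict, duration is arrival time minus start, and hop count is the second tuple entry; these are exactly the three quantities the distortion measure D would act on.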

What exists: Navlakha et al. (2008) proposed MDL-based graph summarization with bounded per-node reconstruction error for static graphs. This is MDL, not Shannon rate-distortion theory: no R-D tradeoff curve, no converse bounds. Adhikari et al. (2017) and Allen et al. (2024) heuristically compress temporal networks while preserving dynamics. None provides information-theoretic bounds in which distortion is measured on temporal journeys.

What it requires:

[Figure: Rate-Distortion for Temporal Graphs. Axes: R (bits) vs. D (journey distortion). The curve for static graphs (Navlakha, MDL) is known; the bound for temporal journeys is unknown.]

Problem 2: Checkpoint spacing via sheaf cohomology

The question: On a directed dependency graph with per-edge prediction error, when are checkpoints necessary, and how should they be spaced?

The sheaf formulation: Seely (2025) shows that on a cellular sheaf, the coboundary δ⁰ produces edgewise prediction errors, and the Hodge decomposition splits them into two components: an exact part, which an encoder can remove by re-choosing node states, and a harmonic part, which it cannot.

The scalar additive error law ε(e₁) ⊕ ... ⊕ ε(eₖ) may be the wrong abstraction. The right quantity could be the harmonic component of accumulated error: the part that no encoder choice can remove. Checkpoints would then correspond to clamped boundary conditions on a subgraph that suppress or localize the harmonic component.
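A sketch of the decomposition in standard cellular-sheaf notation (conventions follow the usual sheaf-Laplacian literature; the signs and symbols here are illustrative, not necessarily Seely's exact notation):

```latex
% Cellular sheaf F on a graph G: stalks F(v), F(e), restriction maps
% F_{v \triangleleft e}. Coboundary on an oriented edge e = (u \to v):
(\delta^0 x)_e \;=\; F_{v \triangleleft e}\, x_v \;-\; F_{u \triangleleft e}\, x_u
% With no 2-cells, the edge space splits orthogonally:
C^1 \;=\; \operatorname{im} \delta^0 \;\oplus\; \ker (\delta^0)^{\top}
% An edge error field \varepsilon \in C^1 therefore decomposes as
\varepsilon \;=\; \delta^0 x_{\mathrm{opt}} + \varepsilon_H,
\qquad (\delta^0)^{\top} \varepsilon_H = 0,
% where \varepsilon_H is the harmonic part: the component that no choice
% of node states x (no encoder adjustment) can cancel.
```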

Seely's framework gives exact definitions for every term. The open question is whether checkpoint placement can be formulated as a constraint on a relative coboundary problem, and whether spacing bounds follow from the cohomological structure.

The linear restriction aligns with codec prediction (motion-compensated subtraction is linear). Nonlinear extension requires Jacobian linearization.
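A hedged sketch of that linearization step (the predictor P_e and Jacobian blocks J_u, J_v are illustrative names, not terms from the source):

```latex
% Nonlinear per-edge predictor P_e acting on the endpoint states x_u, x_v:
\varepsilon_e(x) \;=\; y_e - P_e(x_u, x_v)
% First-order expansion around an operating point x^0:
\varepsilon_e(x) \;\approx\; \varepsilon_e(x^0)
  \;-\; J_u(x^0)\,\Delta x_u \;-\; J_v(x^0)\,\Delta x_v
% The Jacobian blocks J_u, J_v play the role of linear restriction maps,
% so the Hodge analysis applies to the linearized error field.
```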

Toy model: empirical compression curve

Compress a temporal graph by dropping edges, measure journey distortion. This is greedy edge-dropping, not a Shannon R-D bound, but the shape of the curve is informative:

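A hedged reconstruction of the toy model just described (the function names and the random instance are mine, not the post's): greedily drop the edge whose removal hurts time-respecting reachability least, and record the surviving fraction of reachable pairs:

```python
# Sketch of the toy model: greedy edge-dropping on a temporal graph,
# scoring each stage by the fraction of surviving time-respecting
# reachable (source, destination) pairs. Not a Shannon R-D bound.
import random

def reachable_pairs(edges, nodes):
    """Ordered pairs (s, d), s != d, joined by a journey with strictly
    increasing timestamps (single pass over time-sorted events)."""
    ordered = sorted(edges, key=lambda e: e[2])
    count = 0
    for s in nodes:
        arrival = {s: -1}  # earliest arrival from s; -1 = "at the start"
        for u, v, t in ordered:
            if u in arrival and arrival[u] < t < arrival.get(v, float("inf")):
                arrival[v] = t
        count += len(arrival) - 1  # exclude s itself
    return count

def greedy_compression_curve(edges, nodes):
    """Repeatedly drop the edge whose removal hurts reachability least;
    return [(edges kept, fraction of reachable pairs preserved), ...]."""
    edges = list(edges)
    full = reachable_pairs(edges, nodes)
    curve = [(len(edges), 1.0)]
    while edges:
        best = max(range(len(edges)),
                   key=lambda i: reachable_pairs(edges[:i] + edges[i + 1:], nodes))
        edges.pop(best)
        curve.append((len(edges), reachable_pairs(edges, nodes) / max(full, 1)))
    return curve

random.seed(0)
nodes = range(8)
events = [(random.randrange(8), random.randrange(8), t) for t in range(30)]
for kept, frac in greedy_compression_curve(events, nodes)[::6]:
    print(kept, round(frac, 2))
```

Each greedy stage keeps a subset of the previous edge set, so the preserved fraction is non-increasing in the number of dropped edges; the curve traces an achievable (not optimal) rate-distortion tradeoff.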

What the toy model shows

The curve shows diminishing returns as edges are added, and a critical threshold below which reachability collapses. The open problem asks for the information-theoretic minimum: for a class of temporal graphs, how close can any compression scheme get to the frontier?

Source material: jk Temporal Compression (blog post)