This guide answers a simple question:
If you want to use T81 today, what should you run?
It is not a full architecture overview. It is a practical operator path for the current real surfaces in the repo.
Use T81 today as three practical systems:

1. The CanonFS interchange surface, for canonical import and export of content.
2. The admitted fixed AI chains, for governed AI execution that ends in canonical objects.
3. The bundle consumer surface, for consuming a finished chain from its bundle.
If you understand those three, you understand the strongest current usable value in the repo.
T81 is a governed execution pipeline where:
The system is not centered on model outputs.
It is centered on:
Typical systems:
T81:
This shifts the system from:
to:
Use this when you want to:
Start here:
Core commands:
```shell
./build/t81 canonfs import <path> --canonfs-root <root> --json
./build/t81 canonfs export <ref> --canonfs-root <root> --out <path> --json
```
What you get back:
- `status`
- `policy_result`
- `policy_profile`
- `provenance_ref`
- `manifest_ref`

Use this surface when the main question is:
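A minimal consumer sketch of that envelope. The field names come from this guide; the exact JSON shape and the stand-in values below are assumptions, so check your own `--json` output before relying on them.

```shell
# Hypothetical --json envelope for a canonfs import; values are stand-ins.
resp='{"status":"ok","policy_result":"pass","policy_profile":"default","provenance_ref":"cfs:p1","manifest_ref":"cfs:m1"}'

# Pull one string field out of the flat envelope (no nested objects assumed).
field() {
  printf '%s' "$2" | sed -n "s/.*\"$1\":\"\([^\"]*\)\".*/\1/p"
}

# Gate downstream work on the policy outcome, not on the payload alone.
if [ "$(field policy_result "$resp")" = "pass" ]; then
  ref=$(field manifest_ref "$resp")
  echo "admitted: $ref"
else
  echo "refused" >&2
fi
```

The point of the sketch is the branch: consumers key off `policy_result` before touching any ref, rather than treating the output as admitted by default.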
Use this when you want AI execution to end as a canonical object chain instead of a loose model output.
Current admitted family:
- `assess-fixed`
- `route-fixed`
- `classify-fixed`

Start here:
Run the current examples:
```shell
bash examples/ai-and-inference/model-load-canonfs/run_assess_fixed_host_action.sh
bash examples/ai-and-inference/model-load-canonfs/run_route_fixed_path_selection.sh
bash examples/ai-and-inference/model-load-canonfs/run_classify_fixed_rule_selection.sh
```
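If you want all three chains in one pass, a fail-fast runner is a small wrapper. The script names and directory come from this guide; the `DRY_RUN` switch and the `run_chains` helper are ours.

```shell
# Run the three admitted fixed chains in guide order, stopping at the first
# failure. DRY_RUN=1 only previews the sequence; unset it to execute.
run_chains() {
  for script in \
    run_assess_fixed_host_action.sh \
    run_route_fixed_path_selection.sh \
    run_classify_fixed_rule_selection.sh
  do
    if [ -n "${DRY_RUN:-}" ]; then
      echo "would run: $script"
    else
      bash "examples/ai-and-inference/model-load-canonfs/$script" || return 1
    fi
  done
}

DRY_RUN=1 run_chains
```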
What each chain does:
Use this surface when the main question is:
Use this when you want to start from the final object and consume the chain safely.
The bundle is the system boundary.
It is the only object required to:
Consumers should not reconstruct execution from logs or intermediate state. They should begin from the bundle and follow canonical references.
Start here:
Run the current consumer examples:
```shell
bash examples/ai-and-inference/model-load-canonfs/run_assess_fixed_bundle_consumer.sh
bash examples/ai-and-inference/model-load-canonfs/run_route_fixed_bundle_consumer.sh
bash examples/ai-and-inference/model-load-canonfs/run_classify_fixed_bundle_consumer.sh
```
If you already have a `bundle_ref` and want a small normalized projection instead of a full bundle artifact dump, run:

```shell
bash examples/ai-and-inference/model-load-canonfs/summarize_ai_bundle.sh \
  "<bundle_ref>" \
  "<canonfs_root>"
```
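Both positionals are required, so a thin wrapper that fails loudly on missing arguments is worth having. The `summarize` wrapper name is ours; only the script path comes from this guide.

```shell
# Guard the two required positionals before delegating to the real script:
# an empty bundle_ref or canonfs_root should fail fast, not produce a
# confusing downstream error.
summarize() {
  if [ "$#" -ne 2 ] || [ -z "$1" ] || [ -z "$2" ]; then
    echo "usage: summarize <bundle_ref> <canonfs_root>" >&2
    return 2
  fi
  bash examples/ai-and-inference/model-load-canonfs/summarize_ai_bundle.sh "$1" "$2"
}
```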
What these prove:
- `bundle_ref`
- `record_ref` and `action_ref`

Use this surface when the main question is:
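A consumer-side sketch of the rule above: start from the bundle and refuse to proceed unless every ref in the chain is present. The `key=value` projection format here is invented for illustration; a real consumer follows the canonical references the bundle itself carries.

```shell
# Hypothetical normalized projection of a finished chain; values are stand-ins.
projection='bundle_ref=cfs:b1 record_ref=cfs:r1 action_ref=cfs:a1'

# Succeed only if the named ref appears in the projection; otherwise complain.
require_ref() {
  case " $2" in
    *" $1="*) return 0 ;;
    *) echo "missing $1" >&2; return 1 ;;
  esac
}

require_ref bundle_ref "$projection" &&
require_ref record_ref "$projection" &&
require_ref action_ref "$projection" &&
echo "chain complete"
```

The short-circuit `&&` chain mirrors the consumption rule: any missing reference stops the consumer before it touches partial state.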
If you want the shortest useful path, do this in order:
1. `run_assess_fixed_host_action.sh`
2. `run_assess_fixed_bundle_consumer.sh`

That sequence teaches:
- `examples/storage-and-canonfs/canonfs-interchange/`
- `examples/ai-and-inference/model-load-canonfs/*_bundle_consumer.sh` scripts
- `examples/ai-and-inference/model-load-canonfs/summarize_ai_bundle.sh`

Use T81 today as a system that:
That is the clearest current truth of the repo.