Custom Workloads
Instead of using a preset, you can define an arbitrary step DAG with `custom_job` in your plan request. The planner handles placement, transfer insertion, and scheduling the same way it does for presets.
Request Structure
Send `custom_job` instead of `preset_id` in POST /v1/plan:
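A minimal body might look like the following sketch. Only fields documented in the step schema below are used; the resource values are illustrative, not recommendations:

```python
import json

# Sketch of a POST /v1/plan body using custom_job in place of preset_id.
# Field names follow the step schema documented below; values are illustrative.
plan_request = {
    "custom_job": {
        "steps": [
            {
                "id": "capture",
                "location": "onboard",   # onboard | ground | either
                "duration_s": 120,
                "depends_on": [],        # no prerequisites
                "requires": {            # all 5 resource fields are required
                    "power_w": 45,
                    "compute": 0.6,
                    "thermal_w": 40,
                    "memory_mb": 512,
                    "storage_mb": 2048,
                },
            }
        ]
    }
}

print(json.dumps(plan_request, indent=2))
```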
Step Schema
Each step in the `steps` array:

- `id` — Unique identifier within the job. Used in `depends_on` references.
- Name — Human-readable step name.
- `location` — Where the step runs: `onboard`, `ground`, or `either`. When set to `either`, the planner decides based on data reduction ratio and transfer cost.
- `duration_s` — Execution duration in seconds. Must be positive.
- `depends_on` — IDs of prerequisite steps. Use `[]` for steps with no dependencies.
- `requires` — Resource requirements:
  - `power_w` — Power consumption in watts
  - `compute` — Fraction of compute capacity (0.0–1.0)
  - `thermal_w` — Thermal dissipation in watts
  - `memory_mb` — Memory required in MB
  - `storage_mb` — Storage required in MB
- Input data size in MB.
- Output data size in MB.
- Data reduction ratio — Output/input ratio. `0.1` means 10:1 reduction. Set to `null` for data-generating steps.
- Checkpoint interval — Checkpoint frequency in seconds. `0` disables checkpointing.
- Failure policy — `fail`, `retry_next_window`, or `retry_immediate`.
- Maximum retry attempts.
- Encryption — `none`, `aes128`, or `aes256`. Adds data expansion overhead.
- Checksum — `none`, `crc32`, or `sha256`.
- Fault tolerance — If set, provide `min_data_fraction` (0.0–1.0) and `reduced_duration_s`.

Security Overrides
Optional `security` object at the job level:
For example: restricted classification, authenticated uplink, and key rotation every 24 orbits.
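As an illustration, that override could be expressed like this. The key names here are assumptions, since the exact schema of the `security` object is not shown on this page:

```python
# Hypothetical security override; key names are illustrative assumptions
# matching the documented example: restricted classification,
# authenticated uplink, key rotation every 24 orbits.
security = {
    "classification": "restricted",
    "authenticated_uplink": True,
    "key_rotation_orbits": 24,
}

print(security)
```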
Policy
Optional `policy` object:
| Field | Options | Default |
|---|---|---|
| `objective` | `min_latency`, `min_energy`, `max_reliability`, `balanced` | `balanced` |
| `deadline_orbits` | Number of orbital periods | 6 |
| `max_data_loss_fraction` | 0.0–1.0 | 0.01 (1%) |
| `min_delivery_confidence` | 0.0–1.0 | 0.95 (95%) |
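Put together, a policy overriding some defaults might look like the following sketch, using the fields and defaults from the table above:

```python
# Policy object; omitted fields keep their documented defaults.
policy = {
    "objective": "min_energy",        # min_latency | min_energy | max_reliability | balanced
    "deadline_orbits": 4,             # default: 6 orbital periods
    "max_data_loss_fraction": 0.01,   # default: 0.01 (1%)
    "min_delivery_confidence": 0.99,  # default: 0.95 (95%)
}

print(policy)
```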
Validation Rules
- Every step must have a unique `id`
- `depends_on` references must point to existing step IDs
- No circular dependencies (validated via DFS)
- `location` must be `onboard`, `ground`, or `either`
- `duration_s` must be positive
- `requires` must include all 5 resource fields

Requests that fail validation are rejected with 400 and a `validation_error` describing the issue.
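The cycle check can be sketched as a standard three-color DFS over `depends_on` edges. This is not the server's actual implementation, and it assumes `depends_on` references have already been checked to exist:

```python
def has_cycle(steps):
    """Detect circular depends_on references with a three-color DFS.
    Assumes all depends_on IDs refer to existing steps."""
    graph = {s["id"]: s.get("depends_on", []) for s in steps}
    WHITE, GRAY, BLACK = 0, 1, 2   # unvisited, on current path, done
    color = {sid: WHITE for sid in graph}

    def dfs(sid):
        color[sid] = GRAY
        for dep in graph[sid]:
            if color[dep] == GRAY:              # back edge: cycle found
                return True
            if color[dep] == WHITE and dfs(dep):
                return True
        color[sid] = BLACK
        return False

    return any(color[sid] == WHITE and dfs(sid) for sid in graph)

cyclic = [
    {"id": "a", "depends_on": ["b"]},
    {"id": "b", "depends_on": ["a"]},
]
print(has_cycle(cyclic))  # True
```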
Example
A 2-step pipeline: capture sensor data on-board, then process on the ground. The planner automatically inserts transfer steps (downlink/uplink) at space-ground boundaries, so your 2-step job may produce a plan with 3+ segments.
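Under the schema above, the request body for this pipeline might look like the following sketch. Resource values are illustrative, and placing the optional `policy` object alongside `custom_job` at the top level is an assumption:

```python
import json

# Sketch of the 2-step capture -> process pipeline: capture runs onboard,
# processing runs on the ground. The planner will insert the downlink
# transfer between them.
request_body = {
    "custom_job": {
        "steps": [
            {
                "id": "capture",
                "location": "onboard",
                "duration_s": 90,
                "depends_on": [],
                "requires": {"power_w": 30, "compute": 0.4, "thermal_w": 25,
                             "memory_mb": 256, "storage_mb": 4096},
            },
            {
                "id": "process",
                "location": "ground",
                "duration_s": 600,
                "depends_on": ["capture"],
                "requires": {"power_w": 200, "compute": 0.8, "thermal_w": 180,
                             "memory_mb": 8192, "storage_mb": 16384},
            },
        ]
    },
    "policy": {"objective": "balanced"},  # top-level placement assumed
}

print(json.dumps(request_body, indent=2))
```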

