Python SDK
Audience: workflow builders
Status: Preview
The Python SDK is the main intended builder surface for Ophiolite.
It is designed to expose Ophiolite nouns such as `Project`, and to keep lower-level transport, platform-admin, and operator-authoring details in explicit advanced namespaces rather than mirroring internal Rust modules directly.
It is also narrower than the full repo. The Python SDK is a platform surface, not a mirror of TraceBoost desktop commands or app-local workflow orchestration.
Stable public vocabulary
The top-level package should teach durable subsurface and workflow language first:

- `Project`
- `Survey`
- `Well`
- `Wellbore`
- `WellboreBinding`
- `SectionSelection`
- `TraceLocalPipeline`
- `SubvolumePipeline`
- `GatherPipeline`
- `PostStackNeighborhoodPipeline`
- `VelocityScanSpec`
- `BandpassFilter`
- `RmsAgc`
- `avo_reflectivity(...)`
- `rock_physics_attribute(...)`
- `avo_intercept_gradient_attribute(...)`
The main rule is:
- domain nouns and typed workflow helpers belong in `ophiolite_sdk`
- raw transport shapes, platform internals, and extension plumbing do not

Advanced namespaces:

- `ophiolite_sdk.analysis`
- `ophiolite_sdk.avo`
- `ophiolite_sdk.operators`
- `ophiolite_sdk.platform`
- `ophiolite_sdk.interop`
The rule for those advanced namespaces is:
- `analysis`, `avo`, `operators`, and `platform` remain platform-owned expert surfaces
- `interop` is an explicit compatibility and transport lane, not the primary teaching surface
- app-local TraceBoost transport names should not be copied into the Python SDK as public API
Operator exposure
Built-in operators should usually be exposed through typed workflow surfaces instead of generic request objects.
For logs and AVO:
- `wellbore.elastic_log_set(bindings=...)`
- `elastic.run_avo(layering=..., experiment=...)`
- `result.response_source(...)`
- `result.crossplot_source(...)`
For seismic processing:
- `project.surveys()`
- `survey.operator_catalog()`
- `TraceLocalPipeline.named(...).bandpass(...).agc_rms(...)`
- `survey.preview_processing(selection, pipeline)`
- `survey.run_processing(pipeline, output_collection_name=...)`
For external Python operator authoring:
- `ophiolite_sdk.operators.OperatorRegistry`
- `ophiolite_sdk.operators.OperatorRequest`
- `ophiolite_sdk.operators.computed_curve(...)`
This keeps built-in workflow composition and external operator authoring visible without collapsing them into one generic API shape.
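As a rough illustration of the registry pattern behind external operator authoring, the self-contained sketch below mimics the shape of a registry, a request, and a computed-curve decorator. The class and method bodies are hypothetical stand-ins written for this doc, not the actual `ophiolite_sdk.operators` implementation:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class OperatorRequest:
    """Hypothetical request shape: which curve to compute, from which inputs."""
    curve_name: str
    inputs: Dict[str, List[float]]


@dataclass
class OperatorRegistry:
    """Hypothetical registry mapping operator names to computed-curve callables."""
    _operators: Dict[str, Callable[[OperatorRequest], List[float]]] = field(
        default_factory=dict
    )

    def computed_curve(self, name: str):
        """Decorator that registers a callable under a stable operator name."""
        def register(fn):
            self._operators[name] = fn
            return fn
        return register

    def run(self, name: str, request: OperatorRequest) -> List[float]:
        """Invoke a registered operator by name."""
        return self._operators[name](request)


registry = OperatorRegistry()


@registry.computed_curve("impedance")
def impedance(request: OperatorRequest) -> List[float]:
    # Acoustic impedance = velocity * density, sample by sample.
    vp = request.inputs["vp"]
    rho = request.inputs["density"]
    return [v * d for v, d in zip(vp, rho)]
```

The point of the shape is that the platform can later discover and invoke the curve by its registered name, without the authoring package leaking transport details into the public API.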
What it is good for
- local project lifecycle
- survey-backed seismic discovery
- project-owned seismic execution
- well and wellbore navigation
- typed built-in operator composition
- operator package installation
- compute catalog lookup
- compute execution
- Python operator authoring
What it is not yet
- a full mirror of every Rust capability
- a cloud API client
- a place to expose storage internals as the primary abstraction
Namespace guidance
- use `ophiolite_sdk` for the main builder story
- use `Well`, `Wellbore`, and `Survey` for the main object graph and convenience helpers
- use `WellboreBinding` when a later ingest needs to attach explicitly to an existing canonical wellbore
- use `Survey` plus typed seismic workflow objects when the data already lives in an Ophiolite `Project`
- use `TraceBoostApp` and `SeismicDataset` only when the workflow is intentionally loose-store and outside project asset ownership
- use `project.views` when you want explicit project-scoped well-panel, survey-map, and section-overlay resolution
- use `Wellbore.log_curves()` when you want the current well-log curves resolved through the project layer
- use `Wellbore.elastic_log_set(bindings=...)` when you want semantic elastic inputs for log-derived workflows such as AVO
- use `ophiolite_sdk.avo` for domain-first AVO spec objects such as `ElasticChannelBindings`, `LayeringSpec`, `AngleSampling`, and `AvoExperiment`
- use `ophiolite_sdk.analysis` for explicit kernel-style request/response APIs
- use `ophiolite_sdk.operators` for Python operator authoring helpers
- use `ophiolite_sdk.platform` for platform/admin introspection such as the operation catalog
- use `ophiolite_sdk.interop` for raw typed transport models and compatibility-facing shapes
Public versus compatibility lanes
Teach the Python SDK in this order:
- domain-first object graph and typed workflow helpers first
- explicit advanced platform namespaces second
- compatibility lanes only when the workflow genuinely needs them
The current compatibility lanes are:
- `ophiolite_sdk.interop` for raw transport-shaped models
- `TraceBoostApp` and `SeismicDataset` for intentionally loose-store workflows outside project ownership

Those are valid surfaces, but they are not the main public promise. They should stay clearly separated from the domain-first `ophiolite_sdk` story.
Object-first workflow shape
The intended builder flow is:

- `Project` opens or creates the local workspace
- `Project.import_las(...)` can seed canonical log assets directly, with `binding=wellbore.binding()` when vendor headers need explicit wellbore attachment
- `Project.wells()` and `Project.surveys()` discover the main domain objects
- `Well.wellbores()` and `Well.surveys()` continue the project graph without dropping back to raw ids
- `Survey.operator_catalog()` discovers the available seismic operator families for that project-owned asset
- `Survey.preview_processing(...)`, `Survey.run_processing(...)`, `Survey.preview_subvolume(...)`, `Survey.run_subvolume(...)`, `Survey.preview_gather(...)`, `Survey.run_gather(...)`, `Survey.preview_post_stack_neighborhood(...)`, and `Survey.velocity_scan(...)` keep seismic execution on canonical asset ids instead of raw store paths
- `Well.panel()`, `Wellbore.panel()`, `Wellbore.trajectory()`, `Survey.map_view()`, and `Survey.section_well_overlays()` delegate to the same Rust-owned view resolvers exposed under `project.views`
- `Wellbore.log_curves()` and `Wellbore.elastic_log_set(bindings=...)` keep log and AVO preparation object-first without exposing storage internals as the primary abstraction
- `elastic.run_avo(layering=..., experiment=...)` applies domain-first AVO spec objects instead of requiring raw request payloads in the main builder story
This keeps the public story domain-first while still leaving lower-level interop, admin, and authoring shapes available in explicit advanced namespaces.
Log and AVO workflow notes
`Wellbore.elastic_log_set()` is semantic rather than LAS-specific.
The helper resolves the best current elastic inputs in this order:
- `PVelocity`, otherwise `Sonic -> Vp`
- `SVelocity`, otherwise `ShearSonic -> Vs`
- `BulkDensity`
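A minimal sketch of that fallback order, assuming the available curves arrive as a plain dict keyed by log type. The function name and error handling are illustrative, not the SDK's own implementation:

```python
def resolve_elastic_inputs(curves: dict) -> dict:
    """Pick the best current elastic inputs, preferring direct velocity
    curves and falling back to sonic-derived ones, as the doc describes."""
    vp = curves.get("PVelocity") or curves.get("Sonic")        # Sonic -> Vp fallback
    vs = curves.get("SVelocity") or curves.get("ShearSonic")   # ShearSonic -> Vs fallback
    density = curves.get("BulkDensity")                        # no fallback for density

    missing = [name for name, curve in
               [("vp", vp), ("vs", vs), ("density", density)] if curve is None]
    if missing:
        raise ValueError(f"no usable curve for: {', '.join(missing)}")
    return {"vp": vp, "vs": vs, "density": density}
```

So a wellbore with only a compressional sonic still resolves a `vp` input, while a missing bulk-density curve fails loudly instead of silently degrading the AVO inputs.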
For interval-based AVO workflows, the intended progression is:
- `Wellbore.elastic_log_set(bindings=ElasticChannelBindings(...))`
- `Wellbore.top_set(...)` when the workflow should follow authored intervals
- `LayeringSpec.fixed_interval(...)`, `LayeringSpec.from_edges(...)`, `LayeringSpec.from_top_set(...)`, or `top_set.layering(...)`
- `AvoExperiment.zoeppritz(...)`
- `elastic.run_avo(layering=..., experiment=...)`
- `result.response_source(...)`
Users are not limited to fixed bin lengths. The domain-first AVO surface supports:
- fixed intervals with `LayeringSpec.fixed_interval(...)`
- explicit depth edges with `LayeringSpec.from_edges([...])`
- named top-set intervals with `LayeringSpec.from_top_set(...)`
- exact interval-set selectors with `top_set.layering(selectors=[...])`
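The relationship between the first two options can be made concrete: a fixed interval is just a regular set of depth edges, so anything expressible with a fixed bin length could also be written as explicit edges. The helper below is a standalone illustration written for this doc, not part of the SDK:

```python
def fixed_interval_edges(top: float, base: float, interval: float) -> list:
    """Expand a fixed interval over [top, base] into the explicit
    depth edges it implies, clamping the final edge to base."""
    edges = []
    depth = top
    while depth < base:
        edges.append(depth)
        depth += interval
    edges.append(base)
    return edges
```

For example, a 20 ft fixed interval over 1000-1100 ft implies the edges `[1000, 1020, 1040, 1060, 1080, 1100]`, which could equally be handed to an explicit-edges layering spec.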
The stable log-type and interval-set story is meant to stay domain-first:
```python
from ophiolite_sdk.avo import (
    AngleSampling,
    AvoExperiment,
    ElasticChannelBindings,
    LayeringSpec,
)

elastic = wellbore.elastic_log_set(
    bindings=ElasticChannelBindings(vp="Dt", vs="Dts", density="Rho")
)

top_set = wellbore.top_set("lithostrat-tops") or wellbore.top_set()
layering = (
    top_set.layering(selectors=top_set.interval_selectors[:2])
    if top_set is not None and len(top_set.interval_selectors) >= 2
    else LayeringSpec.fixed_interval(20, unit="ft")
)

result = elastic.run_avo(
    layering=layering,
    experiment=AvoExperiment.zoeppritz(
        angles=AngleSampling.range(0, 40, 5)
    ),
)
```

That lets the public API read in canonical subsurface nouns even when the source LAS carried vendor mnemonics such as DTCO, DTSM, and BDCX, or when imported tops contain repeated labels that need exact selectors such as NLLFC#1.
The lower-level layering helpers remain available for advanced users, but the main SDK story should now prefer the ophiolite_sdk.avo spec objects.
That keeps LAS ingest canonical and reusable. A sonic curve can remain the source of truth while a later workflow chooses whether to keep Vp and Vs virtual or materialize derived direct curves with `elastic.materialize_missing_channels(...)`.
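The sonic-to-velocity derivation behind those virtual curves is the standard slowness inversion: with sonic slowness recorded in microseconds per foot, velocity in feet per second is one million divided by the slowness. A sketch, assuming µs/ft units (the SDK's own materialization may differ in units and null handling):

```python
def sonic_to_velocity(slowness_us_per_ft: list) -> list:
    """Convert sonic slowness samples (us/ft) to velocity (ft/s): v = 1e6 / dt."""
    return [1e6 / dt for dt in slowness_us_per_ft]
```

A 100 µs/ft compressional sonic sample therefore corresponds to a 10,000 ft/s Vp sample, whether the curve stays virtual or gets materialized.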
Seismic operator workflow notes
The same domain-first rule applies to seismic processing. For project-owned seismic assets, the public story should read as typed surveys and typed operators, not raw command plumbing:
```python
from ophiolite_sdk import Project, SectionSelection, TraceLocalPipeline

project = Project.open("demo-project")
survey = project.surveys()[0]
selection = SectionSelection.inline(120)
pipeline = (
    TraceLocalPipeline.named(
        "Bandpass + RMS AGC",
        description="Trace-local seismic golden path.",
    )
    .bandpass(8.0, 12.0, 45.0, 60.0)
    .agc_rms(40.0)
)

catalog = survey.operator_catalog()
preview = survey.preview_processing(selection, pipeline)
processed = survey.run_processing(
    pipeline,
    output_collection_name="derived-seismic",
)
```

That keeps discovery and execution on project-owned asset ids while still exposing operators as explicit workflow objects.
If the workflow is intentionally outside project ownership, `TraceBoostApp` and `SeismicDataset` remain available as the loose-store compatibility lane for direct `.tbvol` and `.tbgath` work.
Relationship to the CLI
The Python SDK and CLI should expose the same platform meanings. The SDK is the preferred builder surface; the CLI remains useful for scripting, CI, and operational tasks.
Neither surface should be treated as a wrapper over TraceBoost desktop commands. All three may reach the same Rust-owned behavior, but the desktop command boundary remains app-local.
Deprecations
Some preview aliases still exist for compatibility, but the docs should always teach the preferred current names first.
The public placeholder for that migration ledger is Python SDK deprecations.