
Inference

torchpipe.pipe

Initialization

The initialization interface for torchpipe.pipe is:
import torchpipe as tp
models = tp.pipe(config: Union[Dict[str, str], Dict[str, Dict[str, str]], str])
class torchpipe.pipe(config: Union[Dict[str, str], Dict[str, Dict[str, str]], str])
Parameters
  • config: Union[Dict[str, str], Dict[str, Dict[str, str]], str]: Configuration parameters passed to the backend for parsing: a flat mapping for a single node, a mapping from node names to per-node parameter mappings for multiple nodes, or a path to a configuration file. Specific backends may expand the parameters during parsing.
Example
config = {"backend": "DecodeMat"}
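Based on the signature above, the three accepted config shapes can be sketched as follows (the node name `jpg_decoder` and the file name `config.toml` are placeholders for illustration, not torchpipe requirements):

```python
from typing import Dict

# Flat single-node config: parameter name -> value.
single: Dict[str, str] = {"backend": "DecodeMat"}

# Multi-node config: node name -> that node's parameter mapping.
multi: Dict[str, Dict[str, str]] = {
    "jpg_decoder": {"backend": "DecodeMat"},
}

# A str config is a path to a configuration file.
path = "config.toml"

# Any of the three could then be passed to torchpipe.pipe, e.g.:
# import torchpipe as tp
# model = tp.pipe(single)
```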

Inference

The forward interface for torchpipe.pipe is:
class torchpipe.pipe
def __call__(self, data: Union[Dict[str, Any], List[Dict[str, Any]]]) -> None

Thread-safe forward computation. The call returns None; results are written back into the input dictionary (or into each dictionary in the list) in place.

Parameters
  • "data": Any: Required input; the computing backend retrieves its input from this key and parses it.
  • "result": Any: Holds the output; if this key is absent after the call, no result was produced.
  • "node_name": str: Specifies the target node when the pipeline has multiple root nodes.
  • Other keys (except the system-reserved ones) may serve as input or output, as determined by the backend.
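The in-place calling convention can be illustrated with a minimal stand-in that follows the same dict contract (this `FakePipe` and its uppercase computation are purely illustrative, not real torchpipe behavior):

```python
from typing import Any, Dict

class FakePipe:
    """Illustrative stand-in for torchpipe.pipe's forward contract."""

    def __call__(self, data: Dict[str, Any]) -> None:
        # Read the required input from the "data" key; write the output
        # to "result". If nothing is computed, "result" stays absent.
        if "data" in data:
            data["result"] = data["data"].upper()  # placeholder computation

pipe = FakePipe()
sample: Dict[str, Any] = {"data": "hello"}
pipe(sample)                       # returns None; mutates `sample` in place
assert sample["result"] == "HELLO"

empty: Dict[str, Any] = {}
pipe(empty)
assert "result" not in empty       # no input -> no calculation result
```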

Reserved Key Values in the System

| Key | Definition | Remarks |
| --- | --- | --- |
| TASK_DATA_KEY | data | One of the input keys |
| TASK_RESULT_KEY | result | One of the output keys |
| TASK_CONTEXT_KEY | context | Syntactic sugar for global sharing |
| TASK_EVENT_KEY | event | |
| "_*" | All strings starting with an underscore | |
| TASK_NODE_NAME_KEY | node_name | |
| "global" | Currently used to represent global settings | |
| "default" | | |
| "TASK_*_KEY" | Strings starting with TASK_ and ending with _KEY | |
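A small helper sketching the reserved-key patterns from the table (the function is hypothetical, written only to make the patterns concrete; torchpipe performs its own checks internally):

```python
import re

# Exact reserved keys from the table above.
RESERVED_KEYS = {"data", "result", "context", "event",
                 "node_name", "global", "default"}

def is_reserved(key: str) -> bool:
    """Return True if `key` collides with a system-reserved key pattern."""
    return (
        key in RESERVED_KEYS
        or key.startswith("_")                            # "_*" pattern
        or re.fullmatch(r"TASK_.*_KEY", key) is not None  # "TASK_*_KEY" pattern
    )

assert is_reserved("data")
assert is_reserved("_private")
assert is_reserved("TASK_RESULT_KEY")
assert not is_reserved("color")   # free for backend-defined input/output
```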