vortex_torch.indexer.elementwise¶
Classes

| Abs | Absolute value of an affine transform. |
| Add_Mul | Affine elementwise transform. |
| Elementwise | Unary elementwise dispatcher for rank-3 logical tensors. |
| Relu | ReLU-style elementwise operator. |
| Sigmoid | Sigmoid elementwise operator with affine argument. |
| Silu | SiLU-style elementwise operator with affine pre-transform. |
- class vortex_torch.indexer.elementwise.Elementwise(alpha=1.0, beta=1.0)[source]¶
Bases:
vOp

Unary elementwise dispatcher for rank-3 logical tensors [S, C, D].
This operator dispatches its implementation based only on the input format (x._format). The output tensor has the same logical shape as the input. The optional scalar parameters alpha and beta may be used by certain elementwise operations.
- _impl_map¶
Implementation dispatch table keyed by input format. Each entry maps to
(callable_impl, resolved_output_format).
- op_type¶
The operator type used by the implementation.
- Type:
Optional[ElementwiseOpType]
- output_buffer¶
Preallocated output tensor buffer.
- Type:
Optional[torch.Tensor]
- profile(x, ctx)[source]¶
Validate the input, select the implementation based on x._format, allocate the output buffer, and return a vTensor view.
- Parameters:
x (torch.Tensor) – Input tensor.
ctx (Context) – Execution context.
- Returns:
A vTensor view wrapping the allocated output buffer, using the resolved output format.
- Return type:
vTensor
- Raises:
AssertionError – If the input tensor type, rank, or format is invalid.
- execute(x, ctx)[source]¶
Execute the selected implementation into the internal output buffer.
Expected implementation signature:
impl(x, output, op_type, alpha, beta, ctx)
- Parameters:
x (torch.Tensor) – Input tensor on the same device as the output buffer.
ctx (Context) – Execution context.
- Returns:
The output tensor stored in self.output_buffer.
- Return type:
torch.Tensor
- Raises:
AssertionError – If profile was not called, or if a device mismatch occurs.
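The two-phase profile/execute pattern above can be sketched in plain Python. Everything here is an illustrative stand-in (the class, the "dense" format key, and the list-based buffers are hypothetical), not the library's real vTensor machinery; it only shows how a dispatch table keyed by input format feeds a preallocated output buffer.

```python
def _relu_impl(x, output, alpha, beta):
    # Elementwise thresholded linear rule, written against plain lists.
    for i, v in enumerate(x):
        output[i] = v if v >= alpha else beta

class TinyElementwise:
    # Dispatch table keyed by input format; each entry maps to
    # (callable_impl, resolved_output_format), mirroring _impl_map.
    _impl_map = {"dense": (_relu_impl, "dense")}

    def __init__(self, alpha=0.0, beta=0.0):
        self.alpha, self.beta = alpha, beta
        self.output_buffer = None
        self._selected = None

    def profile(self, x, fmt):
        # Select the implementation from the input format and
        # preallocate an output buffer of the same logical shape.
        impl, out_fmt = self._impl_map[fmt]
        self._selected = impl
        self.output_buffer = [0.0] * len(x)
        return self.output_buffer, out_fmt

    def execute(self, x):
        # Execute the selected implementation into the internal buffer.
        assert self._selected is not None, "profile must be called first"
        self._selected(x, self.output_buffer, self.alpha, self.beta)
        return self.output_buffer

op = TinyElementwise()
op.profile([-2.0, 3.0], "dense")
print(op.execute([-2.0, 3.0]))  # [0.0, 3.0]
```

The split between profile (validation and allocation) and execute (computation only) is what lets execute run repeatedly without reallocating the output buffer.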
- class vortex_torch.indexer.elementwise.Relu(alpha=0.0, beta=0.0)[source]¶
Bases:
Elementwise

ReLU-style elementwise operator.
This operator applies a thresholded linear function:
\[\begin{split}\operatorname{out}(x) = \begin{cases} x, & x \ge \alpha \\ \beta, & x < \alpha \end{cases}\end{split}\]
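A scalar reference for the thresholded linear rule above (the function name is illustrative, not part of the library API); the operator applies this elementwise over the [S, C, D] tensor. Note the defaults alpha=0, beta=0 recover the standard ReLU.

```python
def relu_ref(x, alpha=0.0, beta=0.0):
    # out(x) = x when x >= alpha, otherwise beta
    return x if x >= alpha else beta

print(relu_ref(2.5))                        # 2.5
print(relu_ref(-1.0))                       # 0.0
print(relu_ref(0.5, alpha=1.0, beta=-1.0))  # -1.0
```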
- class vortex_torch.indexer.elementwise.Silu(alpha=0.0, beta=0.0)[source]¶
Bases:
Elementwise

SiLU-style elementwise operator with affine pre-transform.
This operator applies:
\[\operatorname{SiLU}_{\alpha,\beta}(x) = \frac{x}{1 + \exp(\beta x + \alpha)}\]

When \(\alpha = 0\) and \(\beta = -1\), this reduces to the common SiLU/Swish-like form \(x \, \sigma(x)\) (up to the chosen parameterization in the kernel).
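A scalar reference for the parameterized SiLU above (the name is illustrative, not the library API), checking the stated reduction to \(x \, \sigma(x)\) at \(\alpha = 0\), \(\beta = -1\):

```python
import math

def silu_ref(x, alpha=0.0, beta=0.0):
    # SiLU_{alpha,beta}(x) = x / (1 + exp(beta * x + alpha))
    return x / (1.0 + math.exp(beta * x + alpha))

# At alpha=0, beta=-1 this equals x * sigma(x) = x / (1 + e^{-x}):
x = 1.3
assert abs(silu_ref(x, 0.0, -1.0) - x / (1.0 + math.exp(-x))) < 1e-12
```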
- class vortex_torch.indexer.elementwise.Sigmoid(alpha=0.0, beta=0.0)[source]¶
Bases:
Elementwise

Sigmoid elementwise operator with affine argument.
This operator applies:
\[\sigma_{\alpha,\beta}(x) = \frac{1}{1 + \exp(\beta x + \alpha)}\]

When \(\alpha = 0\) and \(\beta = -1\), this is the standard logistic sigmoid \(\sigma(x) = 1 / (1 + e^{-x})\).
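A scalar reference for the affine-argument sigmoid above (illustrative name, not the library API):

```python
import math

def sigmoid_ref(x, alpha=0.0, beta=0.0):
    # sigma_{alpha,beta}(x) = 1 / (1 + exp(beta * x + alpha))
    return 1.0 / (1.0 + math.exp(beta * x + alpha))

# With alpha=0, beta=-1 this is the standard logistic sigmoid:
print(sigmoid_ref(0.0, 0.0, -1.0))  # 0.5
```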
- class vortex_torch.indexer.elementwise.Add_Mul(alpha=0.0, beta=1.0)[source]¶
Bases:
Elementwise

Affine elementwise transform.
This operator applies a simple affine mapping:
\[\operatorname{out}(x) = \beta x + \alpha\]

With the defaults \(\alpha = 0\) and \(\beta = 1\), this is the identity transform \(\operatorname{out}(x) = x\).
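A scalar reference for the affine mapping above (illustrative name, not the library API):

```python
def add_mul_ref(x, alpha=0.0, beta=1.0):
    # out(x) = beta * x + alpha
    return beta * x + alpha

print(add_mul_ref(3.0))                       # 3.0 (identity with defaults)
print(add_mul_ref(3.0, alpha=1.0, beta=2.0))  # 7.0
```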
- class vortex_torch.indexer.elementwise.Abs(alpha=0.0, beta=1.0)[source]¶
Bases:
Elementwise

Absolute value of an affine transform.
This operator applies:
\[\operatorname{out}(x) = \lvert \beta x + \alpha \rvert\]

With the defaults \(\alpha = 0\) and \(\beta = 1\), this reduces to the standard absolute value \(\lvert x \rvert\).
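A scalar reference for the absolute affine transform above (illustrative name, not the library API):

```python
def abs_affine_ref(x, alpha=0.0, beta=1.0):
    # out(x) = |beta * x + alpha|
    return abs(beta * x + alpha)

print(abs_affine_ref(-3.0))                       # 3.0 (plain |x| with defaults)
print(abs_affine_ref(1.0, alpha=-4.0, beta=2.0))  # 2.0
```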