Write a user-defined Operator#
This example shows how to create a simple DPF Python plugin holding a single Operator. This Operator, called “easy_statistics”, computes simple statistical quantities on a scalar Field with the help of NumPy. It is a simple example showing how existing routines can be wrapped in DPF Python plugins.
Write Operator#
To write the simplest DPF Python plugin, a single Python script is enough.
Two steps are necessary to create a plugin: an Operator implementation deriving from
ansys.dpf.core.custom_operator.CustomOperatorBase,
and a call to ansys.dpf.core.custom_operator.record_operator().
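Before the full example below, here is a minimal sketch of these two steps. The operator name “my_operator” and its single passthrough pin are placeholders chosen only to show the structure, not part of the actual plugin:

from ansys.dpf import core as dpf
from ansys.dpf.core.custom_operator import CustomOperatorBase, record_operator
from ansys.dpf.core.operator_specification import CustomSpecification, PinSpecification


class MyOperator(CustomOperatorBase):
    @property
    def name(self):
        # name under which the Operator is registered and later instantiated
        return "my_operator"

    @property
    def specification(self) -> CustomSpecification:
        # describe the pins so that input/output methods can be generated
        spec = CustomSpecification()
        spec.description = "Pass the input Field through unchanged."
        spec.inputs = {0: PinSpecification("field", [dpf.Field])}
        spec.outputs = {0: PinSpecification("field", [dpf.Field])}
        return spec

    def run(self):
        # read pin 0, do the work, write pin 0, and flag success
        field = self.get_input(0, dpf.Field)
        self.set_output(0, field)
        self.set_succeeded()


def load_operators(*args):
    # entry point passed to dpf.load_library to record the Operator
    record_operator(MyOperator, *args)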
The “easy_statistics” Operator takes a Field as input and returns
the first quartile, the median, the third quartile, and the variance.
The Python Operator and its recording sit in the file plugins/easy_statistics.py.
This file easy_statistics.py is downloaded and displayed here:
from ansys.dpf.core import examples

GITHUB_SOURCE_URL = "https://github.com/pyansys/pydpf-core/" \
                    "raw/examples/first_python_plugins/python_plugins"
EXAMPLE_FILE = GITHUB_SOURCE_URL + "/easy_statistics.py"

# download the plugin script locally and display its content
operator_file_path = examples.downloads._retrieve_file(
    EXAMPLE_FILE, "easy_statistics.py", "python_plugins"
)
with open(operator_file_path, "r") as f:
    for line in f.readlines():
        print('\t\t\t' + line)
import numpy as np

from ansys.dpf import core as dpf
from ansys.dpf.core.custom_operator import CustomOperatorBase, record_operator
from ansys.dpf.core.operator_specification import CustomSpecification, SpecificationProperties, \
    PinSpecification


class EasyStatistics(CustomOperatorBase):
    @property
    def name(self):
        return "easy_statistics"

    @property
    def specification(self) -> CustomSpecification:
        spec = CustomSpecification()
        spec.description = "Compute the first quartile, the median, the third quartile and the variance of a scalar Field with numpy"
        spec.inputs = {0: PinSpecification("field", [dpf.Field, dpf.FieldsContainer], "scalar Field on which the statistics quantities is computed.")}
        spec.outputs = {
            0: PinSpecification("first_quartile", [float]),
            1: PinSpecification("median", [float]),
            2: PinSpecification("third_quartile", [float]),
            3: PinSpecification("variance", [float]),
        }
        spec.properties = SpecificationProperties("easy statistics", "math")
        return spec

    def run(self):
        field = self.get_input(0, dpf.Field)
        if field is None:
            field = self.get_input(0, dpf.FieldsContainer)[0]
        # compute stats
        first_quartile_val = np.quantile(field.data, 0.25)
        median_val = np.quantile(field.data, 0.5)
        third_quartile_val = np.quantile(field.data, 0.75)
        variance_val = np.var(field.data)
        self.set_output(0, first_quartile_val)
        self.set_output(1, median_val)
        self.set_output(2, third_quartile_val)
        self.set_output(3, float(variance_val))
        self.set_succeeded()


def load_operators(*args):
    record_operator(EasyStatistics, *args)
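For reference, the quantities that run() returns are plain NumPy calls; on an arbitrary sample array (values chosen here purely for illustration) they reduce to:

import numpy as np

sample = np.array([0.0, 1.0, 2.0, 3.0, 4.0])  # illustrative values only
print(np.quantile(sample, 0.25))  # first quartile -> 1.0
print(np.quantile(sample, 0.5))   # median -> 2.0
print(np.quantile(sample, 0.75))  # third quartile -> 3.0
print(np.var(sample))             # variance -> 2.0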
Load Plugin#
Once a Python plugin is written, it can be loaded with the function
ansys.dpf.core.core.load_library(), which takes as first argument the path to the
plugin directory, as second argument "py_" followed by the name of the Python script,
and as last argument the name of the function used to record operators.
import os
from ansys.dpf import core as dpf
from ansys.dpf.core import examples
# python plugins are not supported in process
dpf.start_local_server(config=dpf.AvailableServerConfigs.GrpcServer)
operator_server_file_path = dpf.upload_file_in_tmp_folder(operator_file_path)
dpf.load_library(os.path.dirname(operator_server_file_path), "py_easy_statistics", "load_operators")
'py_easy_statistics successfully loaded'
Once the Operator is loaded, it can be instantiated with:
new_operator = dpf.Operator("easy_statistics")
To use this new Operator, a workflow computing the norm of the displacement
is connected to the “easy_statistics” Operator.
Methods of the easy_statistics Operator are dynamically added thanks to the
Operator's specification defined in the plugin.
[Workflow diagram: ds -> displacement -> norm -> easy_statistics]
Use the Custom Operator#
ds = dpf.DataSources(dpf.upload_file_in_tmp_folder(examples.static_rst))
displacement = dpf.operators.result.displacement(data_sources=ds)
norm = dpf.operators.math.norm(displacement)
new_operator.inputs.connect(norm)
print("first quartile is", new_operator.outputs.first_quartile())
print("median is", new_operator.outputs.median())
print("third quartile is", new_operator.outputs.third_quartile())
print("variance is", new_operator.outputs.variance())
first quartile is 0.0
median is 7.491665033689507e-09
third quartile is 1.4276663319275634e-08
variance is 3.054190175494998e-17
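Because the specification names each pin, the input can also be connected by pin name instead of the generic connect call used above. A minimal sketch, assuming the usual dynamically generated pin attributes (inputs.field on the custom Operator and outputs.field on the norm Operator):

# connect the "field" input pin by name; equivalent to new_operator.inputs.connect(norm)
new_operator.inputs.field.connect(norm.outputs.field)
print("median is", new_operator.outputs.median())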
Total running time of the script: (0 minutes 3.735 seconds)