teehr.SignatureMetrics#

class teehr.SignatureMetrics[source]#

Bases: object

Define and customize signature metrics.

Notes

Signature metrics operate on a single field. Available signature metrics are:

  • Average

  • Count

  • MaxValueTime

  • Maximum

  • Minimum

  • Sum

  • Variance
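
Each metric is a small pydantic model that can be used with its defaults or customized before being passed to a metrics query. A minimal sketch using only the defaults documented below (the printed values follow those defaults):

    from teehr import SignatureMetrics

    # Instantiate with defaults: each metric reads "primary_value" and
    # writes its result to the column named by output_field_name.
    avg = SignatureMetrics.Average()
    mx = SignatureMetrics.Maximum()

    # Fields can also be set at construction and are validated by pydantic.
    cnt = SignatureMetrics.Count(output_field_name="n_values")

    print(avg.output_field_name, mx.output_field_name, cnt.output_field_name)
    # average maximum n_values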

Methods

class Average(*, return_type: str | ~pyspark.sql.types.ArrayType | ~pyspark.sql.types.MapType = 'float', unpack_results: bool = False, unpack_function: ~typing.Callable = <function unpack_sdf_dict_columns>, transform: ~teehr.models.metrics.basemodels.TransformEnum = None, output_field_name: str = 'average', func: ~typing.Callable = <function average>, input_field_names: str | ~teehr.models.str_enum.StrEnum | ~typing.List[str | ~teehr.models.str_enum.StrEnum] = ['primary_value'], attrs: ~typing.Dict = {'category': MetricCategories.Signature, 'display_name': 'Average', 'optimal_value': None, 'short_name': 'average', 'value_range': None})#

Bases: DeterministicBasemodel

Average of the values in the input field.

Parameters:
  • bootstrap (BootstrapBasemodel) – The bootstrap model, by default None.

  • transform (TransformEnum) – The transformation to apply to the data, by default None.

  • output_field_name (str) – The output field name, by default “average”.

  • func (Callable) – The function to apply to the data, by default signature_funcs.average().

  • input_field_names (Union[str, StrEnum, List[Union[str, StrEnum]]]) – The input field names, by default [“primary_value”].

  • attrs (Dict) – The static attributes for the metric.

model_config: ClassVar[ConfigDict] = {'arbitrary_types_allowed': True, 'extra': 'forbid', 'validate_assignment': True}#

Configuration for the model; should be a dictionary conforming to pydantic's ConfigDict.
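
For instance, a hedged sketch of customizing Average; the "log" transform value is an assumption, so check teehr.models.metrics.basemodels.TransformEnum for the members actually supported:

    from teehr import SignatureMetrics

    # Rename the output column and apply a transform before averaging.
    # "log" is assumed to be a valid TransformEnum member; verify before use.
    log_avg = SignatureMetrics.Average(
        transform="log",
        output_field_name="log_average",
    )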

class Count(*, return_type: str | ~pyspark.sql.types.ArrayType | ~pyspark.sql.types.MapType = 'float', unpack_results: bool = False, unpack_function: ~typing.Callable = <function unpack_sdf_dict_columns>, output_field_name: str = 'count', func: ~typing.Callable = <function count>, input_field_names: str | ~teehr.models.str_enum.StrEnum | ~typing.List[str | ~teehr.models.str_enum.StrEnum] = ['primary_value'], attrs: ~typing.Dict = {'category': MetricCategories.Signature, 'display_name': 'Count', 'optimal_value': None, 'short_name': 'count', 'value_range': None})#

Bases: DeterministicBasemodel

Count of the values in the input field.

Parameters:
  • bootstrap (BootstrapBasemodel) – The bootstrap model, by default None.

  • transform (TransformEnum) – The transformation to apply to the data, by default None.

  • output_field_name (str) – The output field name, by default “count”.

  • func (Callable) – The function to apply to the data, by default signature_funcs.count().

  • input_field_names (Union[str, StrEnum, List[Union[str, StrEnum]]]) – The input field names, by default [“primary_value”].

  • attrs (Dict) – The static attributes for the metric.

model_config: ClassVar[ConfigDict] = {'arbitrary_types_allowed': True, 'extra': 'forbid', 'validate_assignment': True}#

Configuration for the model; should be a dictionary conforming to pydantic's ConfigDict.
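
As an illustration, Count can be pointed at a different column; "secondary_value" is assumed here to be a field in the joined timeseries table:

    from teehr import SignatureMetrics

    # Count the values in the secondary (e.g. simulated) timeseries column.
    sec_count = SignatureMetrics.Count(
        input_field_names=["secondary_value"],  # assumed column name
        output_field_name="secondary_count",
    )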

class MaxValueTime(*, return_type: str = 'timestamp', unpack_results: bool = False, unpack_function: ~typing.Callable = <function unpack_sdf_dict_columns>, transform: ~teehr.models.metrics.basemodels.TransformEnum = None, output_field_name: str = 'max_value_time', func: ~typing.Callable = <function max_value_time>, input_field_names: str | ~teehr.models.str_enum.StrEnum | ~typing.List[str | ~teehr.models.str_enum.StrEnum] = ['primary_value', 'value_time'], attrs: ~typing.Dict = {'category': MetricCategories.Signature, 'display_name': 'Max Value Time', 'optimal_value': None, 'short_name': 'max_val_time', 'value_range': None})#

Bases: DeterministicBasemodel

Max Value Time: the time at which the maximum value occurs.

Parameters:
  • bootstrap (BootstrapBasemodel) – The bootstrap model, by default None.

  • transform (TransformEnum) – The transformation to apply to the data, by default None.

  • output_field_name (str) – The output field name, by default “max_value_time”.

  • func (Callable) – The function to apply to the data, by default signature_funcs.max_value_time().

  • input_field_names (Union[str, StrEnum, List[Union[str, StrEnum]]]) – The input field names, by default [“primary_value”, “value_time”].

  • attrs (Dict) – The static attributes for the metric.

model_config: ClassVar[ConfigDict] = {'arbitrary_types_allowed': True, 'extra': 'forbid', 'validate_assignment': True}#

Configuration for the model; should be a dictionary conforming to pydantic's ConfigDict.
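
Unlike the other signature metrics, MaxValueTime reads two columns, the value and its timestamp, and returns a timestamp. A sketch ("secondary_value" is an assumed column name):

    from teehr import SignatureMetrics

    # Time of the secondary series' peak; list the value column first and
    # the timestamp column second, mirroring the default ordering.
    sec_peak_time = SignatureMetrics.MaxValueTime(
        input_field_names=["secondary_value", "value_time"],
        output_field_name="secondary_max_value_time",
    )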

class Maximum(*, return_type: str | ~pyspark.sql.types.ArrayType | ~pyspark.sql.types.MapType = 'float', unpack_results: bool = False, unpack_function: ~typing.Callable = <function unpack_sdf_dict_columns>, transform: ~teehr.models.metrics.basemodels.TransformEnum = None, output_field_name: str = 'maximum', func: ~typing.Callable = <function maximum>, input_field_names: str | ~teehr.models.str_enum.StrEnum | ~typing.List[str | ~teehr.models.str_enum.StrEnum] = ['primary_value'], attrs: ~typing.Dict = {'category': MetricCategories.Signature, 'display_name': 'Maximum', 'optimal_value': None, 'short_name': 'maximum', 'value_range': None})#

Bases: DeterministicBasemodel

Maximum of the values in the input field.

Parameters:
  • bootstrap (BootstrapBasemodel) – The bootstrap model, by default None.

  • transform (TransformEnum) – The transformation to apply to the data, by default None.

  • output_field_name (str) – The output field name, by default “maximum”.

  • func (Callable) – The function to apply to the data, by default signature_funcs.maximum().

  • input_field_names (Union[str, StrEnum, List[Union[str, StrEnum]]]) – The input field names, by default [“primary_value”].

  • attrs (Dict) – The static attributes for the metric.

model_config: ClassVar[ConfigDict] = {'arbitrary_types_allowed': True, 'extra': 'forbid', 'validate_assignment': True}#

Configuration for the model; should be a dictionary conforming to pydantic's ConfigDict.
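
A short sketch computing maxima for two columns within one list of metrics ("secondary_value" is an assumed column name):

    from teehr import SignatureMetrics

    primary_max = SignatureMetrics.Maximum()  # defaults to "primary_value"
    secondary_max = SignatureMetrics.Maximum(
        input_field_names=["secondary_value"],  # assumed column name
        output_field_name="secondary_maximum",
    )
    include_metrics = [primary_max, secondary_max]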

class Minimum(*, return_type: str | ~pyspark.sql.types.ArrayType | ~pyspark.sql.types.MapType = 'float', unpack_results: bool = False, unpack_function: ~typing.Callable = <function unpack_sdf_dict_columns>, transform: ~teehr.models.metrics.basemodels.TransformEnum = None, output_field_name: str = 'minimum', func: ~typing.Callable = <function minimum>, input_field_names: str | ~teehr.models.str_enum.StrEnum | ~typing.List[str | ~teehr.models.str_enum.StrEnum] = ['primary_value'], attrs: ~typing.Dict = {'category': MetricCategories.Signature, 'display_name': 'Minimum', 'optimal_value': None, 'short_name': 'minimum', 'value_range': None})#

Bases: DeterministicBasemodel

Minimum of the values in the input field.

Parameters:
  • bootstrap (BootstrapBasemodel) – The bootstrap model, by default None.

  • transform (TransformEnum) – The transformation to apply to the data, by default None.

  • output_field_name (str) – The output field name, by default “minimum”.

  • func (Callable) – The function to apply to the data, by default signature_funcs.minimum().

  • input_field_names (Union[str, StrEnum, List[Union[str, StrEnum]]]) – The input field names, by default [“primary_value”].

  • attrs (Dict) – The static attributes for the metric.

model_config: ClassVar[ConfigDict] = {'arbitrary_types_allowed': True, 'extra': 'forbid', 'validate_assignment': True}#

Configuration for the model; should be a dictionary conforming to pydantic's ConfigDict.
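
Because validate_assignment is enabled, fields can also be changed after construction, for example:

    from teehr import SignatureMetrics

    low_flow = SignatureMetrics.Minimum()
    low_flow.output_field_name = "low_flow"  # re-validated on assignment
    # Unknown fields are rejected because extra="forbid", e.g.
    # low_flow.not_a_field = 1 raises a pydantic error.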

class Sum(*, return_type: str | ~pyspark.sql.types.ArrayType | ~pyspark.sql.types.MapType = 'float', unpack_results: bool = False, unpack_function: ~typing.Callable = <function unpack_sdf_dict_columns>, transform: ~teehr.models.metrics.basemodels.TransformEnum = None, output_field_name: str = 'sum', func: ~typing.Callable = <function sum>, input_field_names: str | ~teehr.models.str_enum.StrEnum | ~typing.List[str | ~teehr.models.str_enum.StrEnum] = ['primary_value'], attrs: ~typing.Dict = {'category': MetricCategories.Signature, 'display_name': 'Sum', 'optimal_value': None, 'short_name': 'sum', 'value_range': None})#

Bases: DeterministicBasemodel

Sum of the values in the input field.

Parameters:
  • bootstrap (BootstrapBasemodel) – The bootstrap model, by default None.

  • transform (TransformEnum) – The transformation to apply to the data, by default None.

  • output_field_name (str) – The output field name, by default “sum”.

  • func (Callable) – The function to apply to the data, by default signature_funcs.sum().

  • input_field_names (Union[str, StrEnum, List[Union[str, StrEnum]]]) – The input field names, by default [“primary_value”].

  • attrs (Dict) – The static attributes for the metric.

model_config: ClassVar[ConfigDict] = {'arbitrary_types_allowed': True, 'extra': 'forbid', 'validate_assignment': True}#

Configuration for the model; should be a dictionary conforming to pydantic's ConfigDict.
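
For example, a minimal sketch of a Sum metric with a more descriptive output name:

    from teehr import SignatureMetrics

    # Total of the primary values, written to a clearly named output column.
    total_primary = SignatureMetrics.Sum(output_field_name="total_primary_value")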

class Variance(*, return_type: str | ~pyspark.sql.types.ArrayType | ~pyspark.sql.types.MapType = 'float', unpack_results: bool = False, unpack_function: ~typing.Callable = <function unpack_sdf_dict_columns>, bootstrap: ~teehr.models.metrics.basemodels.BootstrapBasemodel = None, transform: ~teehr.models.metrics.basemodels.TransformEnum = None, output_field_name: str = 'variance', func: ~typing.Callable = <function variance>, input_field_names: str | ~teehr.models.str_enum.StrEnum | ~typing.List[str | ~teehr.models.str_enum.StrEnum] = ['primary_value'], attrs: ~typing.Dict = {'category': MetricCategories.Signature, 'display_name': 'Variance', 'optimal_value': None, 'short_name': 'variance', 'value_range': None})#

Bases: DeterministicBasemodel

Variance of the values in the input field.

Parameters:
  • bootstrap (BootstrapBasemodel) – The bootstrap model, by default None.

  • transform (TransformEnum) – The transformation to apply to the data, by default None.

  • output_field_name (str) – The output field name, by default “variance”.

  • func (Callable) – The function to apply to the data, by default signature_funcs.variance().

  • input_field_names (Union[str, StrEnum, List[Union[str, StrEnum]]]) – The input field names, by default [“primary_value”].

  • attrs (Dict) – The static attributes for the metric.

model_config: ClassVar[ConfigDict] = {'arbitrary_types_allowed': True, 'extra': 'forbid', 'validate_assignment': True}#

Configuration for the model; should be a dictionary conforming to pydantic's ConfigDict.
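
Variance is the only metric in this group whose signature exposes a bootstrap field. A hedged sketch; the attached object must be a BootstrapBasemodel instance (for example one of teehr's bootstrapper models), assumed here to be configured elsewhere:

    from teehr import SignatureMetrics

    # Plain variance of the primary values.
    var = SignatureMetrics.Variance()

    # With bootstrapping (hypothetical): attach any BootstrapBasemodel
    # instance, e.g. one created from teehr's bootstrapper models.
    # bootstrapped_var = SignatureMetrics.Variance(bootstrap=some_bootstrap_model)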