Probing Inter Model Metrics

Available Metrics

Selectivity

Metric Name: probe_ably.core.metrics.selectivity.SelectivityMetric

class probe_ably.core.metrics.selectivity.SelectivityMetric[source]
calculate_metrics(targets1: numpy.array, targets2: numpy.array, predicitons1: numpy.array, predicitons2: numpy.array, **kwargs) → float[source]

Calculates the selectivity metric

Parameters
  • targets1 (np.array) – Gold labels of the first set of data

  • targets2 (np.array) – Gold labels of the second set of data

  • predicitons1 (np.array) – Predictions for the first set of data

  • predicitons2 (np.array) – Predictions for the second set of data

Returns

Selectivity score

Return type

float

metric_name() → str[source]

Returns the name of the metric, used for visualization purposes.

Returns

Metric name

Return type

str
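The documented signature takes two (targets, predictions) pairs, which matches the standard definition of selectivity as the accuracy gap between a probing task and its control task. The following is a standalone sketch of that computation; it mirrors the documented `calculate_metrics` signature but is not probe_ably's own implementation, which may differ in detail:

```python
import numpy as np

def selectivity(targets1, targets2, predictions1, predictions2):
    """Accuracy on the first (task) set minus accuracy on the second
    (control) set -- a sketch of the standard selectivity definition."""
    # Fraction of positions where prediction equals the gold label
    acc1 = np.mean(np.asarray(targets1) == np.asarray(predictions1))
    acc2 = np.mean(np.asarray(targets2) == np.asarray(predictions2))
    return float(acc1 - acc2)
```

A high selectivity indicates the probe performs well on the task while failing on the control, suggesting the representation (rather than probe capacity) carries the information.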

Implementing New Metrics

To implement a new inter-model metric, extend the following abstract class and implement its methods:

class probe_ably.core.metrics.abstract_inter_model_metric.AbstractInterModelMetric[source]
abstract calculate_metrics(targets1, targets2, predicitons1, predicitons2, **kwargs) → float[source]

Abstract method that calculates the inter-model metric.

Parameters
  • targets1 (np.array) – Gold labels of the first set of data

  • targets2 (np.array) – Gold labels of the second set of data

  • predicitons1 (np.array) – Predictions for the first set of data

  • predicitons2 (np.array) – Predictions for the second set of data

Returns

Inter model metric score

Return type

float

abstract metric_name()[source]

Abstract method that returns the name of the metric, used for visualization purposes.

Returns

Metric name

Return type

str

Once implemented, you can reference the metric by its full class name in the configuration file.
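As an illustration, a hypothetical metric that reports the accuracy gap between two sets of predictions might look like the following. So that the snippet runs standalone, a minimal stand-in for `AbstractInterModelMetric` is defined inline; in practice you would instead import and subclass `probe_ably.core.metrics.abstract_inter_model_metric.AbstractInterModelMetric`:

```python
import numpy as np
from abc import ABC, abstractmethod

# Minimal stand-in mirroring the documented interface, so this sketch is
# self-contained; in real use, subclass probe_ably's own abstract class.
class AbstractInterModelMetric(ABC):
    @abstractmethod
    def calculate_metrics(self, targets1, targets2,
                          predicitons1, predicitons2, **kwargs) -> float: ...

    @abstractmethod
    def metric_name(self) -> str: ...

class AccuracyDifferenceMetric(AbstractInterModelMetric):
    """Hypothetical metric: accuracy on the first set minus the second."""

    # Parameter names follow the documented signature spelling.
    def calculate_metrics(self, targets1, targets2,
                          predicitons1, predicitons2, **kwargs) -> float:
        acc1 = np.mean(np.asarray(targets1) == np.asarray(predicitons1))
        acc2 = np.mean(np.asarray(targets2) == np.asarray(predicitons2))
        return float(acc1 - acc2)

    def metric_name(self) -> str:
        # Shown in visualizations, per the metric_name() contract above
        return "Accuracy Difference"
```

You would then point the configuration file at the full class name, e.g. `my_package.metrics.AccuracyDifferenceMetric` (a hypothetical module path for this example).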