Solar Clipping State

Overview

FeatureCalcSolarClippingState determines whether a Huawei solar inverter is clipping at each 5-minute timestamp — that is, operating at or near its maximum possible AC output and unable to convert all available DC power. The result is a binary feature (0 = not clipping, 1 = clipping).

The maximum possible power at any moment is constrained by two physical limits, both derived from the Huawei inverter data sheet:

  • Temperature derating: the inverter reduces maximum output as ambient temperature rises.
  • P-Q capability curve: the inverter's apparent power capacity (S_max) depends on grid voltage, and the maximum active power is limited by P_max = sqrt(S_max² - Q²).

Calculation Logic

1. Data Preparation

Features for the inverter and its associated weather station are fetched from Bazefield (features with the _b# suffix) and rounded to 5-minute timestamps with a ±2-minute tolerance. All feature columns are forward-filled and then backward-filled to minimize gaps.

2. Average Grid Voltage

The three line-to-line grid voltages are averaged:

Text Only
AverageGridVoltage = mean(GridLineABVoltage, GridLineBCVoltage, GridLineCAVoltage)
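The averaging step is a plain arithmetic mean; a one-line sketch (function name is illustrative):

```python
def average_grid_voltage(v_ab: float, v_bc: float, v_ca: float) -> float:
    """Mean of the three line-to-line voltages, used for the P-Q curve lookup."""
    return (v_ab + v_bc + v_ca) / 3.0
```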

3. Active Power Reference Selection

The active power used for clipping comparison depends on the inverter state:

  • If curtailed (CurtailmentState = 1) or stopped (IEC-OperationState = 1): use ActivePowerTheoretical_5min.AVG (the power that would have been produced).
  • Otherwise: use ActivePower_5min.MAX (the actual maximum within the 5-minute window).
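The selection above can be sketched as a small helper (names and signature are illustrative, not the library's API):

```python
def reference_active_power(
    curtailment_state: int,
    iec_operation_state: int,
    active_power_max: float,
    theoretical_avg: float,
) -> float:
    """Pick the active power used for the clipping comparison."""
    # When the inverter is curtailed or stopped, the measured power is not
    # representative, so the theoretical (would-have-been) power is used.
    if curtailment_state == 1 or iec_operation_state == 1:
        return theoretical_avg
    return active_power_max
```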

4. Temperature Derating

Based on the Huawei inverter data sheet (piecewise linear):

Temperature Range  Max Power After Derating
≤ 30 °C            330 kW (constant)
30 – 50 °C         Linear: 330 → 270 kW
50 – 60 °C         Linear: 270 → 150 kW
> 60 °C            150 kW (constant)
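The derating table can be written as a piecewise-linear function. This is a sketch built only from the breakpoints above (function name is illustrative):

```python
def temperature_derated_limit(amb_temp_c: float) -> float:
    """Maximum output power (kW) after temperature derating."""
    if amb_temp_c <= 30.0:
        return 330.0
    if amb_temp_c <= 50.0:
        # 330 kW at 30 °C falling linearly to 270 kW at 50 °C
        return 330.0 - (amb_temp_c - 30.0) * (330.0 - 270.0) / 20.0
    if amb_temp_c <= 60.0:
        # 270 kW at 50 °C falling linearly to 150 kW at 60 °C
        return 270.0 - (amb_temp_c - 50.0) * (270.0 - 150.0) / 10.0
    return 150.0
```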

5. P-Q Capability Curve

Maximum active power from the P-Q curve depends on grid voltage (per-unit) and measured reactive power:

Text Only
pu_voltage = AverageGridVoltage / nominal_ac_voltage

S_max = 330.0 kVA  (if pu_voltage ≥ 1.0)
      = 313.5 kVA  (if 0.95 ≤ pu_voltage < 1.0)
      = 297.0 kVA  (if 0.90 ≤ pu_voltage < 0.95)
      = 280.5 kVA  (if pu_voltage < 0.90)

P_max = sqrt(max(S_max² - Q², 0))

If |Q| > 198 kVAR, P_max = 0 (reactive power limit exceeded).
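Putting the voltage steps and the P_max formula together, a sketch of the P-Q limit (values from the table above; function name is illustrative):

```python
import math

def pq_curve_limit(avg_voltage: float, nominal_ac_voltage: float, q_kvar: float) -> float:
    """Maximum active power (kW) from the voltage-dependent P-Q capability curve."""
    if abs(q_kvar) > 198.0:
        return 0.0  # reactive power beyond the data-sheet limit
    pu = avg_voltage / nominal_ac_voltage
    if pu >= 1.0:
        s_max = 330.0
    elif pu >= 0.95:
        s_max = 313.5
    elif pu >= 0.90:
        s_max = 297.0
    else:
        s_max = 280.5
    # P_max = sqrt(S_max² - Q²), clamped at zero
    return math.sqrt(max(s_max**2 - q_kvar**2, 0.0))
```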

6. Maximum Possible Power Selection

The calculator selects the limit (temperature derating or P-Q curve) that is closest to the actual active power, rather than always taking the minimum. This accounts for real-world inverter behavior where only one constraint is active at a time:

Text Only
max_possible = temperature_limit  if |temperature_limit - ActivePower| ≤ |PQ_limit - ActivePower|
             = PQ_limit           otherwise
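A minimal sketch of this closest-limit selection (function name is illustrative):

```python
def max_possible_power(temp_limit: float, pq_limit: float, active_power: float) -> float:
    """Return whichever limit lies closest to the measured active power."""
    if abs(temp_limit - active_power) <= abs(pq_limit - active_power):
        return temp_limit
    return pq_limit
```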

7. Clipping State Determination

Text Only
ClippingState = 1  if |ActivePower - max_possible| ≤ 0.005 × max_possible
ClippingState = 0  otherwise

The 0.5% tolerance accounts for measurement noise and minor control deviations.
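The final comparison is a relative tolerance check; a sketch with the 0.5% tolerance from the rule above (the zero-limit guard is my assumption, added so the comparison is well defined when max_possible is 0):

```python
def clipping_state(active_power: float, max_possible: float, tol: float = 0.005) -> int:
    """1 if active power is within tol (0.5%) of the maximum possible power."""
    if max_possible <= 0.0:
        return 0  # assumption: no meaningful limit to compare against
    return int(abs(active_power - max_possible) <= tol * max_possible)
```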


Database Requirements

Feature Attribute

Attribute         Value
server_calc_type  solar_clipping_state

Object Attributes

Attribute                   Required  Description
reference_weather_stations  Yes       Dict with at least a "complete_ws" key naming the weather station object for ambient temperature.
nominal_ac_voltage          Yes       Nominal AC voltage of the inverter (V). Used to compute per-unit voltage for the P-Q curve.

Features (inverter — from Bazefield)

Feature                          Description
ActivePower_5min.MAX             Maximum active power within the 5-minute window (kW)
ActivePowerTheoretical_5min.AVG  Theoretical power (kW); used when curtailed or stopped
GridLineABVoltage_5min.AVG       Line AB voltage (V)
GridLineBCVoltage_5min.AVG       Line BC voltage (V)
GridLineCAVoltage_5min.AVG       Line CA voltage (V)
ReactivePower_5min.AVG           Reactive power (kVAR)
CurtailmentState_5min.REP        Curtailment state flag
IEC-OperationState_5min.REP      IEC operation state flag

Features (complete weather station — from Bazefield)

Feature           Description
AmbTemp_5min.AVG  Ambient temperature (°C), used for temperature derating

Class Definition

FeatureCalcSolarClippingState(object_name, feature)

Class used to calculate the ClippingState feature for solar assets.

For this class to work, the feature must have the attribute 'server_calc_type' set to 'solar_clipping_state'.

Parameters:

  • object_name

    (str) –

    Name of the object for which the feature is calculated. It must exist in performance_db.

  • feature

    (str) –

    Feature of the object that is calculated. It must exist in performance_db.

Source code in echo_energycalc/feature_calc_solar_clipping_state.py
Python
def __init__(
    self,
    object_name: str,
    feature: str,
) -> None:
    """
    Class used to calculate ClippingState Feature for solar assets.

    For this class to work, the feature must have the attribute 'server_calc_type' set to 'solar_clipping_state'.

    Parameters
    ----------
    object_name : str
        Name of the object for which the feature is calculated. It must exist in performance_db.
    feature : str
        Feature of the object that is calculated. It must exist in performance_db.
    """
    # initialize parent class
    super().__init__(object_name, feature)

    # Defining which object attributes are required for the calculation.
    self._add_requirement(
        RequiredObjectAttributes(
            {
                self.object: [
                    "reference_weather_stations",
                    "nominal_ac_voltage",
                ],
            },
        ),
    )
    self._fetch_requirements()

    # Getting the complete weather station name for the specific object.
    complete_ws = self._requirement_data("RequiredObjectAttributes")[self.object]["reference_weather_stations"]["complete_ws"]

    # Defining the features that will be required for the calculation.
    reference_features = [
        "ActivePower_5min.MAX",
        "GridLineABVoltage_5min.AVG",
        "GridLineBCVoltage_5min.AVG",
        "GridLineCAVoltage_5min.AVG",
        "ReactivePower_5min.AVG",
        "CurtailmentState_5min.REP",
        "ActivePowerTheoretical_5min.AVG",
        "IEC-OperationState_5min.REP",
    ]
    # Adding suffix _b# to features -> necessary to acquire data from bazefield
    features = {
        self.object: [f"{feat}_b#" for feat in reference_features],
        complete_ws: ["AmbTemp_5min.AVG_b#"],
    }
    self._add_requirement(RequiredFeatures(features=features))

feature property

Feature that is calculated. This will be defined in the constructor and cannot be changed.

Returns:

  • str

    Name of the feature that is calculated.

name property

Name of the feature calculator. Is defined in child classes of FeatureCalculator.

This must be equal to the "server_calc_type" attribute of the feature in performance_db.

Returns:

  • str

    Name of the feature calculator.

object property

Object for which the feature is calculated. This will be defined in the constructor and cannot be changed.

Returns:

  • str

    Object name for which the feature is calculated.

requirements property

List of requirements of the feature calculator. Is defined in child classes of FeatureCalculator.

Returns:

  • dict[str, list[CalculationRequirement]]

    Dict of requirements.

    The keys are the names of the classes of the requirements and the values are lists of requirements of that class.

    For example: {"RequiredFeatures": [RequiredFeatures(...), RequiredFeatures(...)], "RequiredObjects": [RequiredObjects(...)]}

result property

Result of the calculation. This is None until the method "calculate" is called.

Returns:

  • DataFrame | None

    Polars DataFrame with a "timestamp" column and one or more feature value columns. None until calculate is called.

calculate(period, save_into=None, cached_data=None, **kwargs)

Run the calculation for the given period and optionally save the result.

Calls _compute to get the result, stores it in result, then calls save. Subclasses should implement _compute instead of overriding this method.

Parameters:

  • period

    (DateTimeRange) –

    Period for which the feature will be calculated.

  • save_into

    (Literal['all', 'performance_db'] | None, default: None ) –
    • "all": save in performance_db and bazefield.
    • "performance_db": save only in performance_db.
    • None: do not save.

    By default None.

  • cached_data

    (DataFrame | None, default: None ) –

    Polars DataFrame with features already fetched/calculated. Passed to _compute to enable chained calculations without re-querying performance_db. By default None.

  • **kwargs

    Forwarded to save.

Returns:

  • DataFrame

    Polars DataFrame with a "timestamp" column and one or more feature value columns.

Source code in echo_energycalc/feature_calc_core.py
Python
def calculate(
    self,
    period: DateTimeRange,
    save_into: Literal["all", "performance_db"] | None = None,
    cached_data: pl.DataFrame | None = None,
    **kwargs,
) -> pl.DataFrame:
    """
    Run the calculation for the given period and optionally save the result.

    Calls :meth:`_compute` to get the result, stores it in :attr:`result`,
    then calls :meth:`save`. Subclasses should implement :meth:`_compute` instead
    of overriding this method.

    Parameters
    ----------
    period : DateTimeRange
        Period for which the feature will be calculated.
    save_into : Literal["all", "performance_db"] | None, optional
        - ``"all"``: save in performance_db and bazefield.
        - ``"performance_db"``: save only in performance_db.
        - ``None``: do not save.

        By default None.
    cached_data : pl.DataFrame | None, optional
        Polars DataFrame with features already fetched/calculated. Passed to
        ``_compute`` to enable chained calculations without re-querying
        performance_db. By default None.
    **kwargs
        Forwarded to :meth:`save`.

    Returns
    -------
    pl.DataFrame
        Polars DataFrame with a ``"timestamp"`` column and one or more feature value columns.
    """
    result = self._compute(period, cached_data=cached_data)
    self._result = result
    self.save(save_into=save_into, **kwargs)
    return result

save(save_into=None, **kwargs)

Method to save the calculated feature values in performance_db.

Parameters:

  • save_into

    (Literal['all', 'performance_db'] | None, default: None ) –

    Where the calculated feature values are saved. The options are:
    • "all": the feature will be saved in performance_db and bazefield.
    • "performance_db": the feature will be saved only in performance_db.
    • None: the feature will not be saved.

    By default None.

  • **kwargs

    (dict, default: {} ) –

    Not being used at the moment. Here only for compatibility.

Source code in echo_energycalc/feature_calc_core.py
Python
def save(
    self,
    save_into: Literal["all", "performance_db"] | None = None,
    **kwargs,  # noqa: ARG002
) -> None:
    """
    Method to save the calculated feature values in performance_db.

    Parameters
    ----------
    save_into : Literal["all", "performance_db"] | None, optional
        Argument that will be passed to the method "save". The options are:
        - "all": The feature will be saved in performance_db and bazefield.
        - "performance_db": the feature will be saved only in performance_db.
        - None: The feature will not be saved.

        By default None.
    **kwargs : dict, optional
        Not being used at the moment. Here only for compatibility.
    """
    # checking arguments
    if not isinstance(save_into, str | type(None)):
        raise TypeError(f"save_into must be a string or None, not {type(save_into)}")
    if isinstance(save_into, str) and save_into not in ["all", "performance_db"]:
        raise ValueError(f"save_into must be 'all', 'performance_db' or None, not {save_into}")

    # checking if calculation was done
    if self.result is None:
        raise ValueError(
            "The calculation was not done. Please call 'calculate' before calling 'save'.",
        )

    if save_into is None:
        return

    upload_to_bazefield = save_into == "all"

    if not isinstance(self.result, pl.DataFrame):
        raise TypeError(f"result must be a polars DataFrame, not {type(self.result)}.")
    if "timestamp" not in self.result.columns:
        raise ValueError("result DataFrame must contain a 'timestamp' column.")

    # rename feature columns to "object@feature" format expected by perfdb polars insert
    feat_cols = [c for c in self.result.columns if c != "timestamp"]
    result_pl = self.result.rename({col: f"{self.object}@{col}" for col in feat_cols})

    self._perfdb.features.values.series.insert(
        df=result_pl,
        on_conflict="update",
        bazefield_upload=upload_to_bazefield,
    )