Solar Module Temperature Loss¶
Overview¶
SolarEnergyLossModuleTemperature calculates the daily energy loss attributable to module temperature effects using a pre-trained model. The model outputs a loss factor (fraction of theoretical power lost due to temperature), which is multiplied by the theoretical power to produce the hourly loss, then aggregated to a daily average.
The model is trained on historical data and stored in performance_db. The training script is at manual_routines\postgres_fit_solar_power_predict on the Performance Server.
Note
Module temperature above 25 °C causes photovoltaic efficiency loss. This loss is always non-negative — gains (temperature below 25 °C) are clipped to zero.
Calculation Logic¶
1. Feature Acquisition¶
Fetches:
- ModuleTempCommOk_5min.AVG from all simple weather stations in reference_weather_stations["simple_ws"].
- ActivePowerTheoretical_10min.AVG from the SPE object itself.
All timestamps are rounded to 5-minute boundaries within ±2 min tolerance.
2. Temperature Averaging¶
If multiple simple weather stations are configured, module temperature is averaged across all stations per timestamp.
3. Complete Day Filtering¶
Data is resampled to hourly frequency. Only days with exactly 24 hourly observations are kept. Incomplete days are discarded and logged as warnings.
Missing temperature values are filled with 25 °C (standard test condition — zero loss assumption).
4. Model Prediction¶
The model is applied to hourly module temperature values:
loss_factor = model.predict(module_temperature) # fraction, e.g. 0.03 = 3% loss
hourly_loss_kW = ActivePowerTheoretical × loss_factor
Negative loss values are clipped to 0 — the model may occasionally predict a small gain, which is discarded.
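The prediction and clipping can be illustrated with a toy stand-in for the trained model (a linear loss of about 0.4 %/°C above 25 °C, an assumption for illustration only; the real coefficients come from the fitted model in performance_db):

```python
import numpy as np

class ToyTempLossModel:
    """Hypothetical stand-in for the trained model: loss factor grows
    linearly above 25 degC, and is negative (a 'gain') below 25 degC."""
    def predict(self, x: np.ndarray) -> np.ndarray:
        # x has shape (n, 1): hourly module temperature in degC
        return 0.004 * (x[:, 0] - 25.0)

model = ToyTempLossModel()
temp = np.array([[20.0], [25.0], [45.0]])          # shape (n, 1)
theoretical_kw = np.array([800.0, 900.0, 1000.0])  # ActivePowerTheoretical
loss_factor = model.predict(temp)                  # e.g. 0.08 = 8% loss at 45 degC
# Negative values (predicted gains below 25 degC) are clipped to zero
hourly_loss_kw = np.clip(theoretical_kw * loss_factor, 0.0, None)
```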
5. Night Zeroing¶
Uses pvlib with the object's latitude and longitude to identify night periods. Hourly loss values during nighttime are set to 0.0 kW.
Forward-fill is applied to remaining null hourly losses after night zeroing.
6. Daily Aggregation¶
Hourly loss values (kW) are averaged to produce daily loss values (average kW over the day):
daily_loss_kW_avg = mean(hourly_loss_kW per day)
Database Requirements¶
Feature Attribute¶
| Attribute | Value |
|---|---|
| server_calc_type | solar_energy_loss_modtemperature |
| feature_options_json | JSON object — see below |
feature_options_json Schema¶
| Key | Type | Required | Description |
|---|---|---|---|
| calc_model_type | string | Yes | Exact model type (e.g., "modtemperature_fit"). |
| model_name | string | Yes | Substring of the model name in performance_db. |
| bazefield_features | boolean | Yes | If true, fetches features from Bazefield. |
Example:
{
  "calc_model_type": "modtemperature_fit",
  "model_name": "modtemperature_regression",
  "bazefield_features": true
}
Object Attributes¶
| Attribute | Required | Description |
|---|---|---|
| reference_weather_stations | Yes | Dict with "simple_ws" key (string or list of strings) naming weather station(s) for module temperature. |
| latitude | Yes | Geographic latitude (decimal degrees). Used for night masking via pvlib. |
| longitude | Yes | Geographic longitude (decimal degrees). Used for night masking via pvlib. |
Calculation Model¶
| Requirement | Description |
|---|---|
| Model type | Must match calc_model_type exactly |
| Model name | Must contain model_name as a substring |
| Input | Hourly average module temperature (°C), shape (n, 1) |
| Output | Hourly temperature loss factor (fraction of theoretical power) |
Features¶
| Feature | Object | Description |
|---|---|---|
| ModuleTempCommOk_5min.AVG | Simple weather station(s) | Module temperature (°C). Fetched with the _b# suffix if bazefield_features = true. |
| ActivePowerTheoretical_10min.AVG | SPE object | Theoretical power (kW). Multiplied by the loss factor. |
Class Definition¶
SolarEnergyLossModuleTemperature(object_name, feature)¶
Class for calculating solar energy loss from module temperature using a pre-trained model.
For this class to work, the feature must have the attribute feature_options_json with the following keys:
- 'calc_model_type': type of the model that will be used to calculate the feature. It must match the type of the model in performance_db.
- 'model_name': name of the model that will be used to calculate the feature.
- 'bazefield_features': bool indicating if the required features need to be acquired from bazefield.
Parameters:
- object_name (str) – Name of the object for which the feature is calculated. It must exist in performance_db.
- feature (str) – Feature of the object that is calculated. It must exist in performance_db.
Source code in echo_energycalc/solar_energy_loss_mod_temperature.py
def __init__(self, object_name: str, feature: str) -> None:
    """
    Class used to calculate features that depend on a PredictiveModel.

    For this class to work, the feature must have the attribute `feature_options_json` with the following keys:
    - 'calc_model_type': type of the model that will be used to calculate the feature. It must match the type of the model in performance_db.
    - 'model_name': name of the model that will be used to calculate the feature.
    - 'bazefield_features': bool indicating if the required features need to be acquired from bazefield.

    Parameters
    ----------
    object_name : str
        Name of the object for which the feature is calculated. It must exist in performance_db.
    feature : str
        Feature of the object that is calculated. It must exist in performance_db.
    """
    # initialize parent class
    super().__init__(object_name, feature)
    # load feature options, model requirements, and deserialize joblib model
    self._setup_model_from_feature_options()
    # defining required features
    simple_ws = self._requirement_data("RequiredObjectAttributes")[self.object]["reference_weather_stations"]["simple_ws"]
    features = {ws: ["ModuleTempCommOk_5min.AVG"] for ws in simple_ws}
    features[self.object] = ["ActivePowerTheoretical_10min.AVG"]
    # Adding suffix _b# to features if bazefield_features is True
    if self._feature_attributes["feature_options_json"].get("bazefield_features", False):
        features = {obj: [f"{feat}_b#" for feat in feats] for obj, feats in features.items()}
    self._add_requirement(RequiredFeatures(features=features))
feature property¶
Feature that is calculated. This will be defined in the constructor and cannot be changed.
Returns:
- str – Name of the feature that is calculated.
name property¶
Name of the feature calculator. Is defined in child classes of FeatureCalculator. This must be equal to the "server_calc_type" attribute of the feature in performance_db.
Returns:
- str – Name of the feature calculator.
object property¶
Object for which the feature is calculated. This will be defined in the constructor and cannot be changed.
Returns:
- str – Object name for which the feature is calculated.
requirements property¶
List of requirements of the feature calculator. Is defined in child classes of FeatureCalculator.
Returns:
- dict[str, list[CalculationRequirement]] – Dict of requirements. The keys are the names of the classes of the requirements and the values are lists of requirements of that class. For example:
  {"RequiredFeatures": [RequiredFeatures(...), RequiredFeatures(...)], "RequiredObjects": [RequiredObjects(...)]}
result property¶
Result of the calculation. This is None until the method "calculate" is called.
Returns:
- DataFrame | None – Polars DataFrame with a "timestamp" column and one or more feature value columns. None until calculate is called.
calculate(period, save_into=None, cached_data=None, **kwargs)¶
Method that will calculate the feature.
This code will do the following:
1. Get module temperature data from the weather stations associated with the object.
2. Average the module temperature data from all weather stations.
3. Resample the data to hourly frequency, keeping only complete days (24 hours of data).
4. Predict the temperature loss using the model and clip negative values, as there is no gain associated with module temperature. The model returns a loss fraction that is multiplied by the theoretical power; the result is the hourly loss in kW.
5. Resample the data to daily frequency, producing the average loss in kW per day.
Parameters:
- period (DateTimeRange) – Period for which the feature will be calculated.
- save_into (Literal['all', 'performance_db'] | None, default: None) – Argument that will be passed to the method "save". The options are:
  - "all": The feature will be saved in performance_db and bazefield.
  - "performance_db": The feature will be saved only in performance_db.
  - None: The feature will not be saved.
- cached_data (DataFrame | None, default: None) – DataFrame with features already queried/calculated. This is useful to avoid needing to query all the data again from performance_db, making chained calculations a lot more efficient.
- **kwargs (dict, default: {}) – Additional arguments that will be passed to the "save" method.
Returns:
- DataFrame – Polars DataFrame with the calculated feature.
Source code in echo_energycalc/solar_energy_loss_mod_temperature.py
def calculate(
    self,
    period: DateTimeRange,
    save_into: Literal["all", "performance_db"] | None = None,
    cached_data: pl.DataFrame | None = None,
    **kwargs,
) -> pl.DataFrame:
    """
    Method that will calculate the feature.

    This code will do the following:
    1. Get module temperature data from the weather stations associated with the object.
    2. Average the module temperature data from all weather stations.
    3. Resample the data to hourly frequency, keeping only complete days (24 hours of data).
    4. Predict the temperature loss using the model and clip negative values, as there is no gain associated with module temperature. The model returns a loss fraction that is multiplied by the theoretical power; the result is the hourly loss in kW.
    5. Resample the data to daily frequency, producing the average loss in kW per day.

    Parameters
    ----------
    period : DateTimeRange
        Period for which the feature will be calculated.
    save_into : Literal["all", "performance_db"] | None, optional
        Argument that will be passed to the method "save". The options are:
        - "all": The feature will be saved in performance_db and bazefield.
        - "performance_db": The feature will be saved only in performance_db.
        - None: The feature will not be saved.
        By default None.
    cached_data : DataFrame | None, optional
        DataFrame with features already queried/calculated. This is useful to avoid needing to query all the data again from performance_db, making chained calculations a lot more efficient.
        By default None.
    **kwargs : dict, optional
        Additional arguments that will be passed to the "save" method.

    Returns
    -------
    pl.DataFrame
        Polars DataFrame with the calculated feature.
    """
    t0 = perf_counter()
    # getting feature values
    self._fetch_requirements(
        period=period,
        reindex=None,
        round_timestamps={"freq": timedelta(minutes=5), "tolerance": timedelta(minutes=2)},
        cached_data=cached_data,
    )
    # getting polars DataFrame with feature values
    raw_df = self._requirement_data("RequiredFeatures")
    t1 = perf_counter()
    # --------------- Adjusting DataFrame structure
    df = self._average_weather_station_features(raw_df, "ModuleTempCommOk_5min.AVG", keep_object_cols=True)
    # Adjusting temperature values to 25 if NaN
    df = df.with_columns(pl.col("ModuleTempCommOk_5min.AVG").fill_null(25.0))
    # Rename to match model input
    df = df.rename({"ModuleTempCommOk_5min.AVG": "TArray"})
    # ------------ Resample to hourly frequency
    # group_by_dynamic on "timestamp" with every="1h", agg mean
    df_hourly = df.sort("timestamp").group_by_dynamic("timestamp", every="1h").agg(
        pl.col("TArray").mean(),
        pl.col("ActivePowerTheoretical_10min.AVG").mean(),
    )
    # Count hours per day to find complete days (24 hours)
    df_hourly = df_hourly.with_columns(pl.col("timestamp").dt.truncate("1d").alias("_date"))
    daily_counts = df_hourly.group_by("_date").agg(pl.len().alias("_hour_count"))
    complete_days = daily_counts.filter(pl.col("_hour_count") == 24)["_date"]
    df_complete_days = df_hourly.filter(pl.col("_date").is_in(complete_days)).drop("_date")
    # Logging discarded days due to incomplete data
    discarded = set(df_hourly["_date"].to_list()) - set(complete_days.to_list())
    if discarded:
        logger.warning(
            f"{self.object} - {self.feature} - {period}: Discarded days due to less than 24 hours of data: {', '.join(str(d) for d in sorted(discarded))}",
        )
    t2 = perf_counter()
    # ------------- Applying model to predict temperature loss
    if df_complete_days.height > 0:
        x = df_complete_days["TArray"].to_numpy().reshape(-1, 1)
        model_result = self._model.predict(x)
        df_complete_days = df_complete_days.with_columns([
            pl.Series("model_result", model_result),
        ])
        df_complete_days = df_complete_days.with_columns(
            (pl.col("ActivePowerTheoretical_10min.AVG") * pl.col("model_result"))
            .clip(lower_bound=0.0)
            .alias("Temp_Loss_10min.AVG"),
        )
        # During night, loss is 0
        obj_attrs = self._requirement_data("RequiredObjectAttributes")[self.object]
        is_night = self._get_night_mask(df_complete_days["timestamp"], obj_attrs["latitude"], obj_attrs["longitude"])
        df_complete_days = df_complete_days.with_columns(
            pl.when(is_night).then(0.0).otherwise(pl.col("Temp_Loss_10min.AVG")).alias("Temp_Loss_10min.AVG"),
        )
        # Forward fill remaining NaN values during daytime
        df_complete_days = df_complete_days.with_columns(
            pl.col("Temp_Loss_10min.AVG").forward_fill(),
        )
        # Resample to daily values (mean of hourly loss = average kW)
        result_daily = (
            df_complete_days.with_columns(pl.col("timestamp").dt.truncate("1d").alias("_date"))
            .group_by("_date")
            .agg(pl.col("Temp_Loss_10min.AVG").mean().alias(self.feature))
            .sort("_date")
            .rename({"_date": "timestamp"})
        )
    else:
        result_daily = pl.DataFrame({"timestamp": pl.Series([], dtype=pl.Date), self.feature: pl.Series([], dtype=pl.Float64)})
    # Cast timestamp to Datetime if it came out as Date
    if result_daily["timestamp"].dtype == pl.Date:
        result_daily = result_daily.with_columns(pl.col("timestamp").cast(pl.Datetime("ms")))
    t3 = perf_counter()
    self._result = result_daily
    # saving results
    self.save(save_into=save_into, **kwargs)
    logger.debug(
        f"{self.object} - {self.feature} - {period}: Requirements during calc {t1 - t0:.2f}s - Data adjustments {t2 - t1:.2f}s - Model prediction {t3 - t2:.2f}s - Saving data {perf_counter() - t3:.2f}s",
    )
    return result_daily
save(save_into=None, **kwargs)¶
Method to save the calculated feature values in performance_db.
Parameters:
- save_into (Literal['all', 'performance_db'] | None, default: None) – Argument that will be passed to the method "save". The options are:
  - "all": The feature will be saved in performance_db and bazefield.
  - "performance_db": The feature will be saved only in performance_db.
  - None: The feature will not be saved.
- **kwargs (dict, default: {}) – Not being used at the moment. Here only for compatibility.
Source code in echo_energycalc/feature_calc_core.py
def save(
    self,
    save_into: Literal["all", "performance_db"] | None = None,
    **kwargs,  # noqa: ARG002
) -> None:
    """
    Method to save the calculated feature values in performance_db.

    Parameters
    ----------
    save_into : Literal["all", "performance_db"] | None, optional
        Argument that will be passed to the method "save". The options are:
        - "all": The feature will be saved in performance_db and bazefield.
        - "performance_db": The feature will be saved only in performance_db.
        - None: The feature will not be saved.
        By default None.
    **kwargs : dict, optional
        Not being used at the moment. Here only for compatibility.
    """
    # checking arguments
    if not isinstance(save_into, str | type(None)):
        raise TypeError(f"save_into must be a string or None, not {type(save_into)}")
    if isinstance(save_into, str) and save_into not in ["all", "performance_db"]:
        raise ValueError(f"save_into must be 'all', 'performance_db' or None, not {save_into}")
    # checking if calculation was done
    if self.result is None:
        raise ValueError(
            "The calculation was not done. Please call 'calculate' before calling 'save'.",
        )
    if save_into is None:
        return
    upload_to_bazefield = save_into == "all"
    if not isinstance(self.result, pl.DataFrame):
        raise TypeError(f"result must be a polars DataFrame, not {type(self.result)}.")
    if "timestamp" not in self.result.columns:
        raise ValueError("result DataFrame must contain a 'timestamp' column.")
    # rename feature columns to "object@feature" format expected by perfdb polars insert
    feat_cols = [c for c in self.result.columns if c != "timestamp"]
    result_pl = self.result.rename({col: f"{self.object}@{col}" for col in feat_cols})
    self._perfdb.features.values.series.insert(
        df=result_pl,
        on_conflict="update",
        bazefield_upload=upload_to_bazefield,
    )