Hybrid LSTM–Transformer Framework for Accurate Indoor Operative Temperature Prediction in HVAC-Controlled Buildings

Accurate prediction of indoor operative temperature is essential for improving HVAC system performance, enhancing occupant comfort, and reducing energy consumption in buildings. Operative temperature represents the combined effect of air temperature and the mean radiant temperature of surrounding surfaces as experienced by occupants. In highly controlled environments such as sentry buildings, precise thermal forecasting enables more responsive and energy-efficient climate control strategies. This study proposes a hybrid deep learning framework to improve the accuracy and robustness of indoor operative temperature prediction.

Concept of Operative Temperature and Its Role in Thermal Comfort

Operative temperature is widely used as a key indicator of indoor thermal comfort because it integrates both air temperature and radiative heat exchange between occupants and surrounding surfaces. Traditional temperature prediction approaches often focus only on air temperature, overlooking the influence of surface temperatures and dynamic indoor conditions. Reliable prediction of operative temperature therefore provides a more realistic representation of indoor thermal perception and supports advanced building energy management strategies.
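The post does not give the formula it uses, but operative temperature is conventionally defined (e.g., in ASHRAE Standard 55) as the heat-transfer-weighted mean of air temperature and mean radiant temperature, which collapses to a simple average at low indoor air speeds. A minimal sketch of that convention:

```python
def operative_temperature(t_air, t_mrt, h_c=None, h_r=None):
    """Operative temperature as the weighted mean of air temperature (t_air)
    and mean radiant temperature (t_mrt), weighted by the convective (h_c)
    and radiative (h_r) heat transfer coefficients. When the coefficients
    are not supplied, fall back to the simple average commonly used at
    low indoor air speeds."""
    if h_c is None or h_r is None:
        return (t_air + t_mrt) / 2.0
    return (h_c * t_air + h_r * t_mrt) / (h_c + h_r)

# With air at 22 degC and surrounding surfaces averaging 24 degC,
# the low-air-speed approximation gives 23 degC.
t_op = operative_temperature(22.0, 24.0)
```

The coefficient values are situation-dependent; the simple-average fallback is the form most building controls use in practice.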

Hybrid LSTM–Transformer Deep Learning Architecture

The proposed Hybrid LSTM–Transformer model integrates the strengths of Long Short-Term Memory networks and Transformer architectures. The LSTM component effectively captures short-term temporal dependencies and nonlinear dynamics within time-series environmental data. In contrast, the Transformer mechanism models long-term dependencies and complex interactions among variables through attention-based learning, enabling the model to better understand evolving thermal patterns within indoor environments.
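The paper's implementation is not published, but the division of labor described above — an LSTM rollout for short-term dynamics, followed by attention over the hidden states for long-range interactions — can be sketched in plain numpy. All dimensions and weight initializations here are illustrative, not the study's:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_forward(x, Wx, Wh, b):
    """Single-layer LSTM rollout. x: (T, d_in); gate weights stacked [i|f|g|o]."""
    T = x.shape[0]
    d_h = Wh.shape[1]
    h, c = np.zeros(d_h), np.zeros(d_h)
    hs = np.empty((T, d_h))
    for t in range(T):
        z = Wx @ x[t] + Wh @ h + b                       # all four gates at once
        i, f = sigmoid(z[:d_h]), sigmoid(z[d_h:2 * d_h])
        g, o = np.tanh(z[2 * d_h:3 * d_h]), sigmoid(z[3 * d_h:])
        c = f * c + i * g                                # cell-state update
        h = o * np.tanh(c)                               # hidden state
        hs[t] = h
    return hs

def self_attention(H, Wq, Wk, Wv):
    """Scaled dot-product self-attention over the LSTM hidden states."""
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])
    A = np.exp(scores - scores.max(axis=1, keepdims=True))
    A /= A.sum(axis=1, keepdims=True)                    # row-wise softmax
    return A @ V

# Toy dimensions: 24 time steps, 4 sensor channels, 8 hidden units.
T, d_in, d_h = 24, 4, 8
x = rng.standard_normal((T, d_in))
Wx = rng.standard_normal((4 * d_h, d_in)) * 0.1
Wh = rng.standard_normal((4 * d_h, d_h)) * 0.1
b = np.zeros(4 * d_h)
Wq = Wk = Wv = rng.standard_normal((d_h, d_h)) * 0.1

H = lstm_forward(x, Wx, Wh, b)        # short-term temporal dynamics
Z = self_attention(H, Wq, Wk, Wv)     # long-range interactions via attention
w_out = rng.standard_normal(d_h) * 0.1
prediction = Z[-1] @ w_out            # scalar temperature estimate from the last step
```

In a real framework the two stages would be trained end to end (e.g., in PyTorch); this sketch only shows how the LSTM output becomes the attention input.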

Temporal Attention Pooling for Enhanced Feature Extraction

To further improve predictive performance, the framework incorporates a Temporal Attention Pooling mechanism. This component highlights critical time steps within the input sequence and assigns higher importance to the most informative temporal features. By focusing on relevant patterns in the historical data, the model reduces noise influence and strengthens its ability to capture meaningful thermal behavior patterns in building environments.
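The exact pooling layer is not specified in the post, but a common form of temporal attention pooling — a learned query vector scoring each time step, with a softmax turning the scores into weights — matches the description. A minimal numpy sketch, with illustrative values:

```python
import numpy as np

def temporal_attention_pool(H, q):
    """Pool a feature sequence H of shape (T, d) into a single vector by
    attending with a learned query q of shape (d,). Time steps whose
    features align with the query receive higher weight."""
    scores = H @ q / np.sqrt(H.shape[1])
    w = np.exp(scores - scores.max())
    w /= w.sum()                       # softmax over the T time steps
    return w @ H, w

H = np.array([[0.1, 0.2],
              [0.9, 0.8],             # the most informative step
              [0.1, 0.1]])
q = np.array([1.0, 1.0])
pooled, weights = temporal_attention_pool(H, q)
```

The weights sum to one, so the pooled vector is a convex combination of the per-step features, dominated by the steps the query deems informative.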

Experimental Setup and Data Collection

High-resolution operational data were collected from a wall-mounted electric radiant heating system used within a controlled indoor environment. The dataset included detailed temporal records of indoor thermal conditions, allowing the model to learn the relationship between system operation and operative temperature variations. Comparative experiments were conducted to evaluate the performance of the proposed hybrid model against several widely used machine learning algorithms.
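The post does not describe its preprocessing, but time-series forecasting at a fixed horizon is typically set up by slicing the sensor record into (lookback window, future target) pairs. A sketch, assuming 1-minute samples so that `horizon=60` corresponds to the 60-minute-ahead target reported later:

```python
import numpy as np

def make_windows(series, lookback, horizon):
    """Turn a 1-D time series into supervised (window, target) pairs:
    each input is `lookback` consecutive samples, and the target is the
    value `horizon` steps after the window ends."""
    X, y = [], []
    for i in range(len(series) - lookback - horizon + 1):
        X.append(series[i:i + lookback])
        y.append(series[i + lookback + horizon - 1])
    return np.array(X), np.array(y)

# Illustrative data: 200 minutes of (synthetic) readings.
series = np.arange(200, dtype=float)
X, y = make_windows(series, lookback=30, horizon=60)
```

The lookback length of 30 is an arbitrary choice for illustration; the study's actual window length is not stated.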

Model Performance and Implications for Building Energy Management

The Hybrid LSTM–Transformer model demonstrated superior predictive accuracy compared with benchmark models such as LSTM, Artificial Neural Networks, Decision Trees, Extreme Gradient Boosting, and Random Forests. The proposed model achieved an R² value above 0.87 for predicting operative temperature at a 60-minute forecasting horizon. The results confirm that integrating sequential learning, attention mechanisms, and hybrid deep learning structures can significantly improve indoor thermal prediction, supporting more intelligent HVAC control and energy-efficient building operations.
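For reference, the R² (coefficient of determination) used to report these results is computed as one minus the ratio of residual to total sum of squares; the temperature values below are invented for illustration, not the study's data:

```python
import numpy as np

def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

y_true = np.array([21.0, 22.5, 23.0, 24.0])   # measured operative temperatures (degC)
y_pred = np.array([21.2, 22.3, 23.1, 23.8])   # model forecasts
score = r_squared(y_true, y_pred)
```

An R² above 0.87 at a 60-minute horizon means the model explains over 87% of the variance in future operative temperature, leaving the remainder to unmodeled disturbances.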

Labels: Indoor Climate Modeling, Deep Learning Applications, Time Series Prediction, Building Automation, Energy Forecasting, Radiant Heating Systems, Artificial Intelligence in Buildings, Sustainable Buildings, Intelligent HVAC
