Data Driven Prediction of Vehicle Cabin Thermal Comfort Using Machine Learning and High Fidelity Simulation Results
Article info

Article history:
Received 15 July 2019
Revised 30 October 2019
Accepted 19 November 2019
Available online 5 December 2019

Abstract

Predicting thermal comfort in an automotive vehicle cabin's highly asymmetric and dynamic thermal environment is critical for developing energy efficient heating, ventilation and air conditioning (HVAC) systems. In this study we have coupled high-fidelity Computational Fluid Dynamics (CFD) simulations and machine learning algorithms to predict vehicle occupant thermal comfort for any combination of glazing properties for any window surface, environmental conditions and HVAC settings (flow-rate and discharge air temperature). A vehicle cabin CFD model, validated against climatic wind tunnel measurements, was used to systematically generate training data that spanned the entire range of boundary conditions which impact occupant thermal comfort. Three machine learning algorithms: linear regression with stochastic gradient descent, random forests and artificial neural networks (ANN) were applied to the simulation data to predict the Equivalent Homogeneous Temperature (EHT) for each passenger and the volume averaged cabin air temperature. The trained machine learning models were tested on unseen data also generated by the CFD model. Our best machine learning model was able to achieve a test error of less than 5% in predicting EHT and cabin air temperature. Predicted EHT can also yield thermal comfort metrics such as Predicted Mean Vote (PMV) and Predicted Percentage of Dissatisfied (PPD), which can account for different passenger profiles (metabolic rates and clothing levels). Machine learning models developed in this work enable predictions of thermal comfort for any combination of boundary conditions in real-time without having to rely on computationally expensive CFD simulations.

© 2019 Elsevier Ltd. All rights reserved.
Fig. 1. Schematics for total body heat balance in a vehicle cabin and a uniform environment.
Table 1
PMV Sensation Scale [13].
-3  Cold
-2  Cool
-1  Slightly Cool
 0  Neutral
+1  Slightly Warm
+2  Warm
+3  Hot

Table 2
Air flow distribution (% of total HVAC flow).
Summer: Center 40, Left 20, Right 20, Console 20
Winter: Defrost 15, Front left heater 30, Front right heater 30, Rear left heater 12.5, Rear right heater 12.5
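For illustration, the distribution in Table 2 can be applied to any total flow rate; a minimal Python sketch (the per-vent percentage interpretation and the dictionary names are our assumptions):

# Split a total HVAC flow rate across vents using the Table 2 distribution,
# assuming the tabulated numbers are percentages of the total flow.
SUMMER_SPLIT = {"center": 40, "left": 20, "right": 20, "console": 20}
WINTER_SPLIT = {"defrost": 15, "front_left_heater": 30, "front_right_heater": 30,
                "rear_left_heater": 12.5, "rear_right_heater": 12.5}

def vent_flows(total_cfm, split):
    """Return per-vent flow rates (CFM) from a percentage distribution."""
    return {vent: total_cfm * pct / 100.0 for vent, pct in split.items()}

print(vent_flows(200, SUMMER_SPLIT))  # {'center': 80.0, 'left': 40.0, 'right': 40.0, 'console': 40.0}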
Appropriate thermo-physical properties were imposed on each wall. In addition, thermal radiation properties such as emissivity, reflectivity and transmissivity were prescribed for each wall. Since solar radiation plays a significant role in the overall heat balance inside the passenger compartment, glazing properties for each window surface were considered in the data generation. An external convective boundary condition was prescribed for these walls as heat transfer coefficients and ambient air temperature, reflecting the speed of the vehicle.

The manikin's surfaces were segmented into 16 body parts. Each of these parts was covered with clothing of constant insulation, specified in units of CLO. Light summer clothing, for example, had an average overall insulation value of 0.6 CLO. A higher level of clothing provides less sensitivity to the surrounding thermal environment.
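As a note on units, 1 CLO corresponds to a thermal resistance of 0.155 m²·K/W. A minimal sketch of the conversion (the per-segment values below are illustrative; the paper only reports the 0.6 CLO overall average for light summer clothing):

CLO_TO_RSI = 0.155  # 1 CLO = 0.155 m^2*K/W

def clo_to_resistance(clo):
    """Convert clothing insulation from CLO to thermal resistance (m^2*K/W)."""
    return clo * CLO_TO_RSI

# Hypothetical per-segment clothing levels, not the paper's actual clothing schedule.
segments = {"torso": 0.8, "upper_arm": 0.6, "thigh": 0.7, "lower_leg": 0.3}
print({name: round(clo_to_resistance(c), 3) for name, c in segments.items()})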
Table 3
Climatic wind tunnel boundary conditions.
Solar Load (W/m²)  Total HVAC Flow Rate (CFM)  Discharge Air Temperature (°C)  Ambient Temperature (°C)  Vehicle Speed (kph)
The volume averaged cabin air temperature from the CFD model matched the target cabin air temperature of 24 °C measured in the tunnel test, which resulted in a UA value of 35 W/K (the same as the wind tunnel data), as shown in Fig. 8. Overall EHT values for the 'virtual' thermal manikins were validated against those obtained from a physical manikin in the climatic wind tunnel tests under steady-state boundary conditions.

Additional validation under typical summer conditions was carried out with data from a test vehicle. Under the same boundary conditions, the volume averaged cabin air temperature calculated in the CFD model matched experimental measurements within ±1 °C.
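A minimal sketch of the bookkeeping behind such a UA value, assuming the usual steady-state definition UA = Q / (T_cabin − T_ambient); the heat flow and ambient temperature below are illustrative, not the wind tunnel values:

def cabin_ua(q_net_watts, t_cabin_c, t_ambient_c):
    """Overall cabin heat loss coefficient (W/K) from a steady-state balance."""
    return q_net_watts / (t_cabin_c - t_ambient_c)

# Example: a 700 W net heat input holding the cabin at 24 degC against a 4 degC
# ambient corresponds to UA = 35 W/K.
print(cabin_ua(q_net_watts=700.0, t_cabin_c=24.0, t_ambient_c=4.0))  # 35.0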
Table 4
Features included in the training data.
Table 5
Hyperparameter optimization of the linear regression model.
Table 6
Hyperparameter optimization of the random forest model.
The final hyperparameter values chosen for the linear regression model are also shown in Table 5.

3.2. Random forests

Random forests are an ensemble learning method that operates by fitting a number of decision trees on various sub-samples of the training dataset [19]. The method was implemented with the machine learning library Scikit-learn in Python [17]. Although random forests natively support multi-output regression, a multi-target regression strategy of fitting one random forest per target was used, since this strategy resulted in better performance during cross-validation. Hyperparameters such as the number of estimators, the maximum number of features to consider for the best split, the maximum depth of the trees and the maximum number of leaf nodes were tuned by conducting a randomized search over the hyperparameter space shown in Table 6 with 6-fold cross-validation (CV) over the training dataset. The final hyperparameter values chosen for the random forest model are also shown in Table 6.
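A minimal Scikit-learn sketch of this one-forest-per-target strategy (the arrays, feature count and search ranges are placeholders, not the actual training data or the Table 6 values):

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import RandomizedSearchCV

# Placeholder data: rows = CFD cases, columns = boundary-condition features;
# targets = four passenger EHT values plus the cabin air temperature.
X = np.random.rand(200, 10)
Y = np.random.rand(200, 5)

param_space = {  # illustrative ranges only
    "n_estimators": [50, 100, 200, 400],
    "max_features": ["sqrt", "log2", None],
    "max_depth": [5, 10, 20, None],
    "max_leaf_nodes": [20, 50, 100, None],
}

# One random forest per target, each tuned by randomized search with 6-fold CV.
models = []
for j in range(Y.shape[1]):
    search = RandomizedSearchCV(RandomForestRegressor(random_state=0), param_space,
                                n_iter=20, cv=6, scoring="neg_mean_absolute_error",
                                random_state=0)
    search.fit(X, Y[:, j])
    models.append(search.best_estimator_)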
Table 7
Neural network architecture parameters considered in the optimization, with the final configuration in the last column.
Hyperparameter [values considered] Final value
Neurons in the first hidden layer [8, 16, 32, 64, 128, 256] 256
Additional hidden layers [0, 1, 2] 1
Neurons in the additional hidden layers [8, 16, 32, 64, 128, 256] 64
Activation [25,26] [ReLU, ELU] ELU
Optimizer [16] [Adam, RMSprop, SGD] Adam
Dropout after each hidden layer [27] [0, 0.1, 0.2, 0.3, 0.4, 0.5] 0.2
Batch size [1, 4, 8, 16, 32] 8
Learning rate [0.001, 0.01, 0.1] 0.001
Table 8
Summary statistics of machine learning models over 10 cross-validation folds.
EHT: Front Left EHT: Front Right EHT: Rear Left EHT: Rear Right Cabin Air Temperature
3.3. Artificial neural network

A fully connected feedforward neural network, as shown in Fig. 9, was developed to predict the EHT for each passenger and the volume averaged cabin air temperature. In a feedforward network, the first layer is the input and the last layer is the output; the layers in between are the hidden layers, and their units are termed "neurons". The ANN can be regarded as a process in which the input features go through a series of nonlinear transformations to predict the outputs [20]. In this work, the open source deep learning library Keras [21,22] with TensorFlow [23] as the backend was used for constructing, training and applying the ANNs. All input features were zero centered and normalized by the mean and standard deviation of the training dataset.

The neural network architecture and hyperparameters were optimized using Talos, the open source hyperparameter optimization solution for Keras [24]. 20% of the training data was used for validation during optimization. The network architecture parameters considered in the optimization are shown in Table 7. Over 5000 network hyperparameter combinations were evaluated by conducting a randomized search through the entire parameter space over several iterations. At each iteration the parameter space was shrunk based on results from the previous iteration. The final network configuration, also shown in Table 7, was chosen based on the loss (Mean Squared Error) on the validation dataset, given by:

Mean Squared Error (MSE) = (1/m) Σ_{i=1}^{m} (ŷ_i − y_i)^2    (4)

where m is the number of samples, ŷ_i is the predicted value and y_i is the actual or observed value.

3.4. Model evaluation

The machine learning models were evaluated on the full set of training data using 10-fold cross-validation (CV), indicating how well each model generalized to unseen data. Table 8 shows the summary statistics for the three models over the 10 folds. The mean absolute error for each fold was defined as:

Mean Absolute Error (MAE) = (1/m) Σ_{i=1}^{m} |ŷ_i − y_i|    (5)

Even with hyperparameter tuning, linear regression with stochastic gradient descent performed the worst, with an average mean absolute error over the 10 folds of approximately 4.5 °C. Overall, the optimized ANN performed the best among the three machine learning models, with an average mean absolute error over the 10 folds of approximately 2.5 °C. P-values were generated through the Wilcoxon signed-rank test for the ANN vs. linear regression with SGD and random forests, to determine whether the differences in performance were statistically significant.
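For concreteness, a minimal Keras sketch of a network matching the final configuration in Table 7; the input dimension, target layout and number of epochs are assumptions, and this is not the paper's actual training script:

import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

n_features = 10  # assumed number of boundary-condition input features
n_outputs = 5    # four passenger EHT values + cabin air temperature

# Table 7 final configuration: 256-unit first hidden layer, one additional 64-unit
# hidden layer, ELU activations, dropout of 0.2, Adam optimizer, learning rate 0.001.
model = keras.Sequential([
    layers.Input(shape=(n_features,)),
    layers.Dense(256, activation="elu"),
    layers.Dropout(0.2),
    layers.Dense(64, activation="elu"),
    layers.Dropout(0.2),
    layers.Dense(n_outputs),  # linear output for the regression targets
])
model.compile(optimizer=keras.optimizers.Adam(learning_rate=0.001), loss="mse")

# Inputs are zero centered and scaled by the training mean and standard deviation.
X = np.random.rand(200, n_features).astype("float32")
Y = np.random.rand(200, n_outputs).astype("float32")
X = (X - X.mean(axis=0)) / X.std(axis=0)

model.fit(X, Y, batch_size=8, epochs=10, validation_split=0.2, verbose=0)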
Fig. 10. Predicted EHT values (°C) by the ANN vs. EHT values from CFD simulations for all four passengers over the test set of 60 cases.
Table 10
Performance metrics of the ANN on test data.
EHT: Front Left EHT: Front Right EHT: Rear Left EHT: Rear Right Cabin Air Temperature
Table 11
Sunroof glass properties.
Sunroof Glass I
0.22 0.72 0.72 0.22
Sunroof Glass II
0.93 0.02 0.93 0.03
Table 12
Summer boundary conditions.
Solar Load (W/m²)  Total HVAC Flow Rate (CFM)  Ambient Temperature (°C)  Vehicle Speed (mph)
1000 200 30 70
Fig. 11. PMV-PPD for all four passengers for the two sunroof glasses under summer boundary conditions.
Table 14
HVAC settings for the two sunroof glasses.
Flow Rate Discharge Air Temperature Predicted Cabin Air Temperature Calculated HVAC Power
CFM °C °C W
Sunroof Glass I
200 4 18.6 2463
Sunroof Glass II
200 9 21.3 1989
Performance metrics of the ANN on unseen data are given in Table 10. After training on the entire training dataset, performance of the ANN on the test set improved compared to the cross-validation datasets. The optimized ANN was able to achieve an RMSE of less than 1.5 °C and a mean absolute percentage error of less than 5% on the test data.

Fig. 10 shows the comparison between the EHT values predicted by the optimized ANN and the EHT values from CFD simulations for all four passengers over the test set of 60 cases.

4.2. Case study: impact of glazing on HVAC energy consumption

The optimized ANN was used to assess the impact of sunroof glazing on energy consumption of the HVAC system under steady-state boundary conditions. Glazing properties of the two sunroof glasses considered in the study are given in Table 11. Glazing properties for all other window surfaces in the vehicle were held constant.

The two glasses were evaluated under the typical summer boundary conditions given in Table 12, with the sun directly overhead (sun altitude = 90° and sun azimuth = 0°).

All four occupants were assumed to have identical biometrics, given in Table 13. Since PMV-PPD calculations are done using predictions of EHT, these traits can be easily modified to accommodate different passenger profiles.
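The passenger-profile dependence enters through Fanger's relations [11]; as a small illustration, the PPD implied by a given PMV value follows from the standard correlation (the helper name is ours, and the full PMV calculation from EHT, metabolic rate and clothing is not reproduced here):

import math

def ppd_from_pmv(pmv):
    """PPD (%) from PMV: 100 - 95*exp(-0.03353*PMV^4 - 0.2179*PMV^2)."""
    return 100.0 - 95.0 * math.exp(-0.03353 * pmv**4 - 0.2179 * pmv**2)

print(ppd_from_pmv(0.0))  # neutral vote -> 5% (the theoretical minimum)
print(ppd_from_pmv(1.0))  # "slightly warm" vote -> about 26% dissatisfied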
To maintain the same level of thermal comfort for all four passengers, as shown in Fig. 11, the HVAC settings (flow rate and discharge air temperature) for the two glass types are given in Table 14, along with the predicted cabin air temperature and the HVAC power calculated as:

HVAC Power = ṁ c_p (T_ambient − T_outlet)    (8)

where ṁ is the mass flow rate of conditioned air, c_p is the specific heat of air, which was assumed to be 1003.62 J/kgK, T_ambient is the ambient air temperature and T_outlet is the discharge air temperature into the cabin. To maintain the same level of thermal comfort for all four passengers under identical summer boundary conditions, the discharge air temperature for the low transmissivity glass (Sunroof Glass II) could be set at 9 °C vs. 4 °C for the clear glass (Sunroof Glass I), which resulted in an HVAC power consumption reduction of 474 W.
reduction of 474 W. [7] S. Kaushik, T. Han, K. Chen, Development of a Virtual Thermal Manikin to
The case study highlights the promise of using data-driven ma- Predict Thermal Sensation in Automobiles, SAE Technical Paper 2012-01-0315,
2012, doi:10.4271/2012-01-0315.
chine learning models to do preliminary design studies of HVAC [8] D. Ghosh, M. Wang, E. Wolfe, K. Chen, S. Kaushik, T. Han, Energy efficient HVAC
systems in real-time without having to rely on computationally ex- system with spot cooling in an automobile - design and CFD analysis, SAE Int.
pensive CFD simulations or wind tunnel tests. J. Passeng. Cars Mech. Syst. 5 (2) (2012) 885–903, doi:10.4271/2012- 01- 0641.
[9] K. Chen, S. Kaushik, T. Han, D. Ghosh, M. Wang, Thermal Comfort Prediction
and Validation in a Realistic Vehicle Thermal Environment, SAE Technical Pa-
per 2012-01-0645, 2012, doi:10.4271/2012-01-0645.
5. Summary
[10] D. Hintea, J. Brusey, E. Gaura, A study on several machine learning methods
for estimating cabin occupant equivalent temperature, in: Proceedings of the
High-fidelity CFD simulations and machine learning algorithms Twelfth International Conference on Informatics in Control, Automation and
Robotics (ICINCO), Colmar, France, 2015, doi:10.4271/2012-01-0645.
were coupled to develop a data-driven tool to predict occupant
[11] P.O. Fanger, Thermal Comfort: Analysis and Applications in Environmental En-
thermal comfort for any combination of glazing properties for gineering, Danish Technical Press, Copenhagen, 1970.
any window surface, environmental conditions and HVAC settings [12] N. Djongyang, R. Tchinda, D. Njomo, Thermal comfort: a review paper, Renew.
(flow-rate and discharge air temperature). Sustain. Energy Rev. 14 (9) (2010) 2626–2640, doi:10.1016/j.rser.2010.07.040.
[13] S. Gilani, M.H. Khan, W. Pao, Thermal comfort analysis of PMV model pre-
Three machine learning algorithms: linear regression with diction in air conditioned and naturally ventilated buildings, Energy Proc. 75
stochastic gradient descent, random forests and artificial neural (2015) 1373–1379, doi:10.1016/j.egypro.2015.07.218.
[14] J. Kiefer, J. Wolfowitz, Stochastic estimation of the maximum of a regression function, Ann. Math. Stat. 23 (3) (1952) 462–466.
[15] L. Bottou, F.E. Curtis, J. Nocedal, Optimization Methods for Large-Scale Machine Learning, 2018, arXiv preprint arXiv:1606.04838.
[16] S. Ruder, An Overview of Gradient Descent Optimization Algorithms, 2017, arXiv preprint arXiv:1609.04747.
[17] F. Pedregosa, G. Varoquaux, A. Gramfort, V. Michel, B. Thirion, O. Grisel, M. Blondel, P. Prettenhofer, R. Weiss, V. Dubourg, J. Vanderplas, A. Passos, D. Cournapeau, M. Brucher, M. Perrot, É. Duchesnay, Scikit-learn: machine learning in Python, J. Mach. Learn. Res. 12 (2011) 2825–2830.
[18] A. Ng, Feature Selection, L1 vs. L2 Regularization, and Rotational Invariance, in: Proceedings of the Twenty-First International Conference on Machine Learning, Banff, Canada, 2004, doi:10.1145/1015330.1015435.
[19] L. Breiman, Random forests, Machine Learning 45 (1) (2001) 5–32, doi:10.1023/A:1010933404324.
[20] A. Geron, Hands-On Machine Learning with Scikit-Learn and TensorFlow, 1st ed., O'Reilly Media Inc., Sebastopol, 2017.
[21] F. Chollet, Keras, (2015), GitHub repository, https://github.com/keras-team/keras
[22] F. Chollet, Deep Learning with Python, 1st ed., Manning Publications Co., Shelter Island, 2018.
[23] M. Abadi, P. Barham, J. Chen, Z. Chen, A. Davis, J. Dean, M. Devin, S. Ghemawat, G. Irving, M. Isard, M. Kudlur, J. Levenberg, R. Monga, S. Moore, D.G. Murray, B. Steiner, P. Tucker, V. Vasudevan, P. Warden, M. Wicke, Y. Yu, X. Zheng, TensorFlow: a system for large-scale machine learning, in: Proceedings of the Twelfth USENIX Symposium on Operating Systems Design and Implementation, Savannah, USA, 2016.
[24] M. Kotila, Talos, (2018), GitHub repository, https://github.com/autonomio/talos
[25] V. Nair, G.E. Hinton, Rectified linear units improve restricted Boltzmann machines, in: Proceedings of the Twenty-Seventh International Conference on Machine Learning, Haifa, Israel, 2010.
[26] D. Clevert, T. Unterthiner, S. Hochreiter, Fast and accurate deep network learning by exponential linear units (ELUs), in: Proceedings of the Fourth International Conference on Learning Representations, San Juan, Puerto Rico, 2016.
[27] N. Srivastava, G. Hinton, A. Krizhevsky, I. Sutskever, R. Salakhutdinov, Dropout: a simple way to prevent neural networks from overfitting, J. Mach. Learn. Res. 15 (2014) 1929–1958.