Abstract
Autonomous vehicles are part of an expanding industry that encompasses various interdisciplinary fields such as dynamic controls, thermal engineering, sensors, data processing, and artificial intelligence. Exposure to extreme environmental conditions, such as changes in temperature and humidity, affects sensor performance. To address potential safety concerns related to sensor perception in autonomous vehicles operating in extremely cold real-world situations, specifically Alaska, the impact of frost and water droplets on vehicle optical sensors is examined in both real-world and laboratory-controlled settings. Machine learning models are utilized to determine the level of vision impediment. Potential hardware and software tools are then introduced as solutions to these environmental impacts. Through this research, a better understanding of the potential caveats is gained, and algorithmic solutions can be suggested to improve autonomous driving even under challenging weather conditions.
1 Introduction
1.1 Sensors for Autonomous Vehicles.
Innovations in sensor technologies have enabled previously isolated systems to visualize external objects and to interact with their surroundings. Specifically, autonomous vehicles utilize various sensors—radio detection and ranging (RADAR), light detection and ranging (LiDAR), and visible light (VIS) sensors—with different monitoring fields of view (FoV), distance quantifications, and dimensional space information to view and interact with their surroundings in real-time.
However, variations in environmental conditions may result in artifacts that impede the sensor view, disrupt signals, and decrease sensor lifetime. For a completely self-driving car to be realized, the effects of environmental conditions on sensors must be addressed. Figure 1 illustrates the process an autonomous vehicle takes to execute an action and the effects of various environmental factors on the sensor fields of view as simulated in a computer-based environment [1–4]. The RADAR FoV remains the same, whereas the LiDAR and VIS views decrease by factors of ∼1.5 and ∼2, respectively [2]. The light waves required by LiDAR and VIS are more susceptible to degradation. However, VIS remains the most popular form of sensor vision due to its high resolution-to-byte ratio, which allows for the faster image processing required in real-time applications. Most current solutions for improving sensor vision in autonomous vehicles aim to prevent obstruction of the sensor lenses through passive hydrophobic coatings or active cleaning systems. Other solutions improve situational awareness through machine learning [3–8].

Schematic of the general autonomous vehicle implementation process and corresponding effect of environmental factors on sensor perception
Autonomous vehicles currently employ machine learning models for rapid and accurate extraction of high-level features (e.g., cars, people, and traffic signals) from low-level features (e.g., curves, edges, and pixels). These models use computer vision networks, such as convolutional neural networks (CNNs), region-based CNNs (RCNN), Mask RCNN, Fast RCNN, or You Only Look Once (YOLO), for object classification and detection [8–18]. Several research approaches [15–24] have been made toward correcting sensor errors, in the form of learning-based (LB) frameworks [15] and improved computer-sensor communications [17]. Studies have shown that human-vehicle teaching-and-learning frameworks increase the flexibility of vehicle actions in situations involving distance visualization errors in the sensing system by training on reactions from people [15–17]. Other research directions focus on postprocessing the image data to remove artifacts via generative adversarial networks (GANs), which weigh discrimination values against common patterns to determine areas of interest [23].
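As a minimal sketch of this detection stage, the example below loads a pretrained YOLOv5 model through the public torch.hub interface and reads out the detected object classes and confidences from a single camera frame. The model variant (yolov5s) and the image file name are illustrative placeholders, not values used in this study.

```python
import torch

# Load a small pretrained YOLOv5 model from the public Ultralytics hub
# (model variant and image path are illustrative placeholders).
model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)

# Run detection on a single camera frame.
results = model("camera_frame.jpg")

# Each row holds one detection: bounding box, confidence, and class name.
detections = results.pandas().xyxy[0]
for _, det in detections.iterrows():
    print(f"{det['name']}: confidence {det['confidence']:.2f}")
```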
Overall, this study seeks to quantify real-world conditions in cold environments and to examine the corresponding effects seen in vehicles equipped with optical sensors to provide insights into possible areas for sensor improvements in terms of hardware and software.
2 Experimental Design and Analysis
Condensation and frosting occur on glassy surfaces at low temperatures (below 10 °C) and high humidity (above 45%) [25]. Water impacts transmission wavelengths via absorption, attenuation, or backscattering, leading to obstructions in sensor perception, perceived false artifacts, or image blurring. Despite the significance placed on sensor perception for autonomous driving, most research studies explore these impacts in simulated computer-based environments [1–23,26]. To address real-world conditions, we examine the conditions a vehicle experiences in Alaska and mimic these conditions in a laboratory setting for image collection consistency. Table 1 summarizes the various methodologies that are tested.
Experimental design summary for the different case studies reported
Case study | Condition | Experimental summary
---|---|---
1 | Real-world environments (Alaska) | Monitoring the real-world temperature and humidity conditions experienced by a vehicle in different states: (a) outdoor parking, driving; (b) indoor parking, driving, soaking
2 | Controlled frost | Mimicking the Case 1 results in a laboratory-controlled environment to analyze the frosting impact on images and suggest potential hardware mitigation tools: (a) quantification of frosting effects on sensor perception and YOLO detection and classification; (b) effects of heated glass on image clarity and frost removal
3 | Controlled droplets | Analysis of a potential machine learning mitigation tool, GANs restoration
2.1 Case 1. Real-World Environments in Alaska.
We equip a vehicle with K-type thermocouples and humidity sensors as detailed in Fig. 2. Briefly, the sensors are located: (1) in the camera housing, (2) on the front windshield, (3) at the rear of the car, (4) on the left side of the car, (5) on the right side of the car, (6) in the front of the car, and (7) inside the car cabin, with humidity sensors at locations (6) and (7). There are two parking conditions—(a) outdoor parking (0 °C or below) and (b) indoor parking (∼20 °C)—based on real-world situations. The driving condition refers to the vehicle in motion at speeds of 30–40 km/h with external temperatures of −21 °C. Cases 1a and 1b are differentiated by outdoor parking (external conditions of approximately −21 °C) and indoor parking (a surrounding garage environment of ∼25 °C).

Schematic displaying the location of seven thermocouples and two humidity sensors used during the Alaska testing
2.2 Case 2. Controlled Frosts.
Case 2 mimics Case 1 driving conditions in a laboratory setting for data collection (Case 2a) and reports on potential hardware to improve sensor perception (Case 2b).
Case 2a examines multifunctional camera (MFC) reference images containing two humanoid figures, a chair, and a bike, taken under controlled environmental conditions that follow the Case 1b driving profile. Ten images are taken at each time-step. This condition has the most environmental variation. The images are examined using blur and precision values based on pretrained YOLO models (YOLO v5, PyTorch) [22,27,28].
Case 2b investigates an active heated glass element as a potential mitigation tool to decrease the blurring effects seen in Case 2a. The glass follows the Case 1b conditions and is placed in front of the MFC, facing a checkerboard pattern. Temperature changes from the heated element are examined for their effect on image quality. The glass is frosted before each temperature test. A temperature of 0 °C indicates that no power is applied to heat the glass.
2.3 Case 3. Controlled Droplets.
Case 3 focuses on the quantification of GANs for removing water droplets. Water is applied to a glass slide via a misting pump, and the slide is placed in front of an MFC facing a parking lot containing a variety of real-world cars. The images are fed into a GANs model, which provides a mask image separating the artifact-affected and nonaffected areas into black and white pixels, along with a restored image. The pixel histograms from the masks are examined for affected percentages. The original and restored images undergo YOLO classification to compare average object count and classification probability, the confidence with which a classified object resembles a car. To the best of our knowledge, this is the first report on incorporating GANs image restoration with YOLO classification for autonomous vehicle applications.
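A minimal sketch of this comparison step is shown below, reusing the public torch.hub YOLOv5 interface sketched earlier and assuming the original and GANs-restored frames are saved as image files; the object counts and mean confidences it reports correspond to the quantities compared in this case study, while the file names and the target class label are illustrative placeholders.

```python
import torch

# Pretrained YOLOv5 model from the public Ultralytics hub (illustrative variant).
model = torch.hub.load("ultralytics/yolov5", "yolov5s", pretrained=True)

def count_and_confidence(image_path, target_class="car"):
    """Return the number of detections of target_class and their mean confidence."""
    detections = model(image_path).pandas().xyxy[0]
    hits = detections[detections["name"] == target_class]
    mean_conf = float(hits["confidence"].mean()) if len(hits) else 0.0
    return len(hits), mean_conf

# Placeholder file names for one droplet-density level.
orig_count, orig_conf = count_and_confidence("original_frame.jpg")
rest_count, rest_conf = count_and_confidence("gan_restored_frame.jpg")
print(f"original: {orig_count} cars, mean confidence {orig_conf:.2f}")
print(f"restored: {rest_count} cars, mean confidence {rest_conf:.2f}")
```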
3 Results and Discussion for Cases 1 and 2
3.1 Case 1. Real-World Environments in Alaska.
Examining the temperature and humidity changes of the vehicle allows for correlation with frosting or condensation events on glassy surfaces, such as the car windshield or camera lenses. Frost on the windows and camera lenses can be physically removed before the car is driven; however, during driving, condensation and frost that build up are difficult to remove. The warm interior cabin and the cold external environment present a temperature gradient that leads to the frosting seen during the driving states.
The monitored results from Case 1a are displayed in Fig. 3(a), where the MFC experiences an overall temperature rate of ∼0.38 °C/min, while the windshield temperature changes at ∼0.34 °C/min. Considering only the driving conditions, the temperature rates are ∼0.44 °C/min and ∼0.24 °C/min for the MFC and windshield, respectively. The faster increase in camera temperature is due to the added heat from powering the sensor. Although the window defrosters remain on, the MFC-captured images are blurry, indicating that water is impeding visualization. Figure 3(b) displays the temperature and humidity changes detailed in Case 1b. During driving (yellow), the temperature drops at a rate of ∼−4 °C/min, which is attributed to the vehicle exiting the warm garage and entering the cold environment, also resulting in blurry MFC images.
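These rates can be reproduced from the logged time series by a simple linear fit of temperature against time; the sketch below illustrates this under the assumption that the thermocouple log is available as time (minutes) and temperature (°C) arrays, which is not a format specified in this study.

```python
import numpy as np

def temperature_rate(time_min, temp_c):
    """Least-squares slope of temperature versus time, in deg C per minute."""
    slope, _ = np.polyfit(np.asarray(time_min), np.asarray(temp_c), 1)
    return slope

# Illustrative values only: a sensor warming from -21 deg C over 20 minutes.
t = np.arange(0, 21)       # minutes
temp = -21.0 + 0.38 * t    # deg C
print(f"rate: {temperature_rate(t, temp):.2f} deg C/min")  # ~0.38
```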

Case 1: Real-world environments in Alaska (a) MFC (location 1 in Fig. 2) and the windshield (location 2) temperature over time for Case 1a. (b) Temperature (left y-axis) and humidity (right y-axis) over time for Case 1b.
3.2 Case 2a. Quantification of Frosting.
The frosting and condensation effects on imaging are investigated in a controlled environment, as shown in Fig. 4. The images are quantified using two metrics—blur and precision—based on pixel value changes. Blur is calculated via comparison of pixel intensities between neighboring pixel maps in a reference image (time = 0) and an input image [29–31]. Blur increases over time in correlation with frosting of the camera lens. Blur values larger than 0.5 indicate that over half of the input image deviates from the reference. Precision compares the YOLO classification and detection similarity between the reference and input images, in which correct object counts and correct classifications are considered. A maximum value of 2 indicates that all objects are detected and each detection is classified correctly.
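The neighborhood comparison behind the blur value is described only at a high level in Refs. [29–31]; the sketch below is therefore one plausible reading, assuming grayscale frames of equal size scaled to [0, 1], a 3 × 3 neighborhood average, and a fixed intensity-deviation threshold, together with a simple count-and-class version of the precision score whose maximum value is 2.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def blur_value(reference, image, deviation=0.1):
    """Fraction of pixels whose 3x3 neighborhood mean deviates from the
    reference by more than `deviation` (images scaled to [0, 1]).
    Values above 0.5 mean over half of the input deviates from the reference."""
    ref_map = uniform_filter(reference.astype(float), size=3)
    img_map = uniform_filter(image.astype(float), size=3)
    return float(np.mean(np.abs(ref_map - img_map) > deviation))

def precision_value(ref_labels, det_labels):
    """Detection score (objects found / reference objects) plus classification
    score (correctly labeled objects / reference objects); maximum value is 2."""
    n_ref = len(ref_labels)
    detected = min(len(det_labels), n_ref) / n_ref
    correct = sum(min(ref_labels.count(c), det_labels.count(c))
                  for c in set(ref_labels)) / n_ref
    return detected + correct
```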

Case 2a. (a) MFC images over time in the environmental chamber. The scale bar is 0.6 m. (b) Blur quantification of frosting over time. (c) Precision over time. The standard deviation (gray) is shown for (b) and (c). (d) Warning activation over time. (e) Temperature in the camera, on the lens, and in the environmental chamber over time. The temperature reading error is ±0.6 °C. Insets show images of frost on the camera lens. The humidity is maintained between 45% and 55%.
At 0 min (the reference point) and 10 min (after the camera is placed in the chamber), all objects are correctly classified; however, not all objects are detected. The precision decreases drastically at 20 min, when the identifications and detections produce false negatives and true negatives. A warning system is introduced to test real-time application of the blur and precision calculations, in which ON (1) activates an alarm to warn the driver of decreased sensor perception. This alarm is meant to work in tandem with the YOLO precision calculations for real-time sensor monitoring. Camera temperatures are also monitored during the frosting experiment to track the temperature gradient for comparison with the values reported in Case 1.
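A minimal sketch of such a warning trigger is given below, building on the blur and precision functions sketched above; the specific thresholds (blur above 0.5, precision below 1) are assumptions for illustration and are not thresholds reported in this study.

```python
def sensor_warning(blur, precision, blur_limit=0.5, precision_limit=1.0):
    """Return 1 (alarm ON) when perception degrades past the assumed thresholds,
    otherwise 0 (alarm OFF)."""
    return 1 if (blur > blur_limit or precision < precision_limit) else 0

# Example: illustrative readings at 20 min, where frost has degraded detection.
print(sensor_warning(blur=0.62, precision=0.8))  # -> 1 (warn the driver)
```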
3.3 Case 2b. Heated Glass Effect on Frost.
Vehicles are equipped with defrosters to prevent frost and condensation; however, air heating systems have limitations. We explore an active heated glass element for lenses or windshields to remove frost efficiently in a localized area. Figure 5(a) showcases the images with the embedded heating element at temperatures of 20 °C, 50 °C, and 70 °C. As seen in Fig. 5(b), increasing the heater temperature decreases blur, indicating an enhancement in clarity. The 0 °C point is the initial state before the heater is turned on, in which the glass is completely frosted. Low temperatures between 20 °C and 30 °C show an increase in blurring, which can be attributed to slow or incomplete melting, leading to areas of ice and water with different refractive indices. At temperatures above 30 °C, the liquid can roll off the glass or form a uniform thin film, which improves visualization. At temperatures above 50 °C, blur values are less than 0.2, indicating clear images within the YOLO framework.

Case 2b: (a) Images from the MFC at different heater temperatures and (b) blur quantification at different heater temperatures with error of ±0.06 °C
4 Implementation of Machine Learning Models
4.1 Case 3. Controlled Droplets.
Images containing varying droplet densities (light, medium 1, medium 2, heavy) are fed through a GANs image restoration model. From the original images (Fig. 6(a)), the GANs model creates a mask (Fig. 6(b)) depicting the artifact-affected areas, which is then used for value quantification (Fig. 6(c)). YOLO is utilized to compare object detection and classification between the original (Fig. 6(e)) and GANs-restored images (Fig. 6(f)). Pixel values over 50 represent the artifact-affected areas and are used to determine the affected percentage, i.e., the ratio of affected pixel count to total pixel count. Both the count of pixel values over 50 and the affected percentage show increasing trends, with over 90% of the image affected in the heaviest case (Fig. 7(a)).

The YOLO classification probability, a value assigned by the trained model expressing the confidence that a detected object is correctly identified, and the object count both increase when GANs-restored images are used as inputs (Fig. 7(b)). For affected percentages of 25%, the probability difference between the original and restored images is less than 2%. Between 35% and 50%, the restored-image probabilities increase by a maximum of 11%. Above 90%, the improvement drops to less than a 1% difference between the original and restored images. The improvement in probability steadily increases for low affected percentages. The classification probability (Fig. 7(b), left y-axis, markers) for the original images remains consistent between 0.45 and 0.56, compared to the GANs-restored range of 0.41–0.64. Although the number of objects counted (Fig. 7(b), right y-axis, bars) decreases with increasing artifact levels, the GANs-restored images show higher object counts than the originals. GANs-restored images show improvements in water-obstructed conditions captured by optical cameras; however, more optimization is required for real-time quantification of artifacts. There is a decrease in probability for both cases in the heavily corrupted images.
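The affected-percentage computation from the GANs mask reduces to thresholding and counting pixels; the sketch below shows this under the assumptions that the mask is loaded as an 8-bit grayscale array and that the threshold of 50 from above is applied directly, with the file name as a placeholder.

```python
import numpy as np
from PIL import Image

# Load the GANs mask as an 8-bit grayscale array (placeholder file name).
mask = np.asarray(Image.open("gan_mask.png").convert("L"))

# Pixel values over 50 are treated as artifact-affected areas.
affected = mask > 50
affected_percent = 100.0 * affected.sum() / mask.size
print(f"affected area: {affected_percent:.1f}%")

# Pixel histogram corresponding to Fig. 6(c).
counts, bin_edges = np.histogram(mask, bins=256, range=(0, 256))
```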

Case 3: (a) Original images, (b) resulting GANs-mask depicting affected pixels, (c) pixel histogram from GANs-mask, and (d) GANs-restored images. YOLO classification of (e) original and (f) GANs restored images.

(a) Pixel count (left) and affected area percent (right); (b) probability (markers, left y-axis) and object count (bars, right y-axis) between the original (circle, left) images and the GAN-restored (square, right) images from YOLO classification
5 Conclusion
Real-world conditions in Alaska are studied to examine the impact of cold and humid environments on optical images for autonomous vehicles. Various real-world conditions are monitored in Case 1 and then used in Case 2 for laboratory-controlled studies. Optical images are quantified with YOLO to determine frosting impacts in terms of blur and precision. Weather conditions are seen to impede object detection by up to 50% via blurring effects or the introduction of artifacts.
Preventive measures are addressed in the form of hardware improvements (Case 2b) and machine learning implementation (Case 3). The heating element showcases potential in real-world applications by improving clarity, decreasing high blur values above 0.5 down to 0.02. Additional software improvement through GANs image restoration serves to raise YOLO detection probability both at low affected percentages (less than 30% of the image area affected by artifacts), by up to 11%, and at high corruption levels (above 50% of the image area affected by artifacts). Although the GANs-restored images require more extensive testing and training before implementation in real-world applications, this study shows the potential for merging the machine learning networks into one streamlined system for improved sensor visibility in the case of artifact obstruction on the lens face.
Funding Data
Mechanical and Aerospace Engineering Department at UCI and the Hyundai Motors Company.
Data Availability Statement
The authors attest that all data for this study are included in the paper.
Nomenclature
- AI =
artificial intelligence
- AV =
autonomous vehicles
- CNN =
convolutional neural network
- FoV =
field of view
- GAN =
generative adversarial network
- LB =
learning based
- LiDAR =
light detection and ranging
- MFC =
multifunctional camera
- RADAR =
radio detection and ranging
- RCNN =
region-based convolutional neural network
- VIS =
visible light sensors
- YOLO =
you only look once object detection