Tesla is known for its intelligence, which is often hard to believe, and what appears on the large central control screen sometimes gives drivers the creeps.
When a Tesla owner abroad drove past a cemetery in broad daylight, the central display suddenly indicated that someone was walking nearby. A Tesla user in Xiamen drove into a tunnel with no other vehicles around, yet the big screen warned of a bus next to him, and kept doing so for ten minutes. Why does this happen? Is there a possible solution? It is hard to say, but there are some practices worth referencing.
Ghosts, or maybe radar noise:
Dinesh Bharadia, a professor of electrical and computer engineering at the Jacobs School of Engineering, University of California San Diego, who specializes in radar technology, said: "Radar images often contain random points that do not belong to any object. Radar can also receive so-called echo signals, that is, reflections of radio waves that do not come directly from the object being detected."
So-called radar target noise means that a constantly moving detection target causes irregular fluctuations in the measured parameters, fluctuations whose statistics resemble ordinary noise. This explains the "random points that do not belong to any object" that radar may produce, but such points are unlikely to form a human figure or a bus on their own; that is more plausibly caused by deviations in the data processing and computation pipeline. As the saying goes, whoever tied the bell must untie it: Tesla has to find the cause itself.
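As a rough illustration of how target noise can scatter spurious points, the toy simulation below draws most range returns around a true value and mixes in a few multipath "ghost" echoes at roughly double the range; a robust statistic such as the median then still recovers the true range. Every parameter here is an illustrative assumption, not a measured value from any real radar.

```python
import random
import statistics

def simulate_radar_returns(true_range_m, n=100, noise_sd=0.3,
                           ghost_prob=0.05, seed=42):
    """Toy model of radar target noise plus spurious echoes.

    Most returns scatter (Gaussian) around the true range; a small
    fraction are 'ghost' multipath echoes at roughly twice the range.
    All parameters are illustrative assumptions.
    """
    rng = random.Random(seed)
    returns = []
    for _ in range(n):
        if rng.random() < ghost_prob:
            # Multipath echo: the wave bounced an extra time
            returns.append(2 * true_range_m + rng.gauss(0, noise_sd))
        else:
            returns.append(true_range_m + rng.gauss(0, noise_sd))
    return returns

# The median is robust to the few ghost points, unlike the mean:
estimate = statistics.median(simulate_radar_returns(20.0))
```

The point of the sketch: individual frames contain outliers that a naive per-frame renderer could draw as "someone passing by," while aggregation across returns suppresses them.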
Dual radar may help Tesla:
It is well known that what Musk hates is lidar. No matter: there is no need to replace radar with lidar at all. Electrical engineers at the University of California San Diego have developed an intelligent method to improve the imaging capability of existing radar sensors so that they can accurately detect the shape and size of objects in a scene. Notably, the system performed well when tested at night and in foggy weather, which helps self-driving cars drive safely in inclement weather.
Severe weather poses challenges for autonomous vehicles. Some vehicles rely on technologies such as lidar and radar to "see" and navigate, but each type of sensor has its drawbacks. Lidar works by bouncing laser beams off surrounding objects and can draw high-resolution 3D images on a clear day, but it is effectively blind in fog, dust, rain, and snow. Radar, which emits radio waves, can "see" in any weather, but it captures only a partial picture of the road scene.
The new radar performs similarly to lidar. Bharadia pointed out: "This is an inexpensive way to achieve bad-weather perception on autonomous vehicles. Lidar can also be integrated with our technology, but radar is cheaper; this way, we do not need to put expensive lasers on top of the radar." Presumably Musk will like it.
According to reports, the system consists of two radar sensors, both mounted on the hood and spaced roughly the width of a car apart (about 1.5 meters). Arranging two radar sensors this way is critical: it lets the system see more space and detail than a single radar sensor could. Isn't this much like a binocular camera, which uses the parallax between two views to determine distance accurately? Only the sensors differ, and the baseline is larger.
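The binocular analogy can be made concrete with elementary triangulation. The sketch below is not the UCSD team's actual algorithm, just textbook geometry: two sensors a known baseline apart each measure a bearing to the same target, and the law of sines fixes the target's position. The 1.5 m baseline is the spacing reported above; everything else is an assumption for illustration.

```python
import math

def triangulate(baseline, angle_a, angle_b):
    """Locate a target from two bearing measurements (radians).

    Sensor A sits at (0, 0), sensor B at (baseline, 0). angle_a is the
    angle at A between the baseline (+x direction) and the target;
    angle_b is the angle at B between the baseline toward A (-x
    direction) and the target. Returns (x, y) in sensor A's frame.
    """
    gamma = math.pi - angle_a - angle_b                       # angle at the target
    dist_a = baseline * math.sin(angle_b) / math.sin(gamma)   # law of sines
    return dist_a * math.cos(angle_a), dist_a * math.sin(angle_a)

# A target 10 m ahead, centered between two sensors 1.5 m apart:
x, y = triangulate(1.5, math.atan2(10, 0.75), math.atan2(10, 0.75))
```

The wider the baseline, the larger the angular difference between the two views for a given target, which is why 1.5 m of separation resolves depth better than two closely spaced antennas would.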
The multi-radar system predicts vehicle size in real time (the red box is the prediction, the blue box the ground truth). In test drives on clear days and nights, the system matched the performance of lidar in determining the size of moving vehicles in traffic. In experiments simulating heavy fog, its performance did not degrade: the research team used a fog machine to "hide" another car, and the system still accurately predicted the 3D geometry of the car ahead, while the lidar sensor essentially failed the test.
New radar system with improved imaging capabilities accurately predicts the size of moving cars in fog.
Two eyes are better than one:
Traditional radar images poorly because, when radio waves reflect off an object, only a small fraction of the signal returns to the sensor. Vehicles, pedestrians, and other objects therefore appear as a sparse set of points.
"This is the problem of using a single radar imaging. It only receives a few points to represent the scene, so the perception is very poor." Kshi ti z Bansal, a PhD student in Computer Science and Engineering at the University of California, San Diego, said: "In this environment, you You may not see other cars.” Therefore, if a single radar causes such blindness, the setting of multiple radars will improve perception by increasing the number of points reflected back. "
The research team found that two radar sensors spaced 1.5 meters apart on the hood are the best arrangement. Bansal said: "By deploying two radars at different vantage points with overlapping fields of view, we create a high-resolution region that makes it easy to detect objects." Isn't that just the width of a car? Why not integrate a small radar into the width-indicator lamps or side mirrors on both sides?
However, more radars also mean more noise. The research team therefore developed a new algorithm that fuses the information from the two radar sensors to produce a noise-free image, and built the first dataset combining data from two radars.
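The article does not describe the fusion algorithm itself, but the core idea of cross-validating two overlapping views can be sketched very simply: keep only the returns that the other sensor corroborates, and discard uncorroborated points as noise. The tolerance, the point lists, and the matching rule below are all illustrative stand-ins, not the team's method.

```python
import math

def fuse_point_clouds(points_a, points_b, tol=0.5):
    """Cross-validate two radar point clouds (a hypothetical sketch).

    A point from sensor A survives only if sensor B also saw a return
    within `tol` meters of it; unmatched points are treated as noise.
    Points are (x, y) tuples in a shared vehicle frame.
    """
    fused = []
    for p in points_a:
        if any(math.dist(p, q) <= tol for q in points_b):
            fused.append(p)
    return fused

# Two real targets and one noise point seen only by sensor A:
clean = fuse_point_clouds([(0.0, 10.0), (5.0, 5.0), (100.0, 100.0)],
                          [(0.1, 10.2), (5.3, 4.9)])
```

This is also why two radars do not simply double the noise: spurious echoes are unlikely to appear at the same spot in both overlapping fields of view, so cross-checking suppresses them while real objects survive.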
Bharadia said: "There is no public data set that can provide this kind of data. These data come from multiple radars with overlapping fields of view. We collected our own data and built our own data set for training algorithms and testing."
The dataset consists of 54,000 radar frames, covering real traffic and simulated day and night driving scenes in heavy fog. Future work will include collecting more data in the rain; to do that, the team first needs to build a better protective enclosure for its hardware.
The team is currently working with Toyota to integrate the new radar technology with cameras. "This may replace lidar. Radar alone cannot tell us a car's color, make, or model, and those features matter for improving the perception of autonomous vehicles," Bharadia said.
Lidar sensors work by emitting a large number of narrow near-infrared beams with circular or elliptical cross-sections, which reflect off objects and return to the lidar sensor's detector.
One problem with lidar sensors is degraded performance in rain. If a lidar beam intersects raindrops a short distance from the transmitter, the drops can reflect enough of the beam back to the receiver to be mistaken for an object. Water droplets can also absorb some of the emitted light, reducing the sensor's effective range.
Researchers from the smart vehicles group at WMG, University of Warwick, specifically simulated and evaluated lidar performance in rain. Using the WMG 3xD simulator, they tested a self-driving car's lidar in rain of varying intensity while driving on simulated local roads. The simulator is a key part of testing self-driving cars: it is equivalent to driving millions of miles, which means cars can be tested against road conditions as realistic as the real thing, in a safe environment.
In the simulator, the researchers subjected the lidar to different probabilistic rainfall models and recorded the false positives and false negatives. They found that as rainfall intensity increased, it became harder for the sensors to detect targets. At short range (up to 50 meters) from the vehicle, a few raindrops were falsely detected as objects. At medium range (50-100 meters) those errors decreased, but as rainfall rose toward 50 mm per hour, the sensor's ability to detect objects weakened progressively with distance.
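The trend the WMG team observed, heavier rain plus longer range meaning weaker detection, matches what a simple two-way Beer-Lambert attenuation model predicts. The sketch below is a generic illustration of that model; the coefficients `a` and `b` are placeholder assumptions, not values from the WMG study or its rain models.

```python
import math

def lidar_range_factor(rain_rate_mm_h, range_m, a=0.01, b=0.6):
    """Toy Beer-Lambert model of lidar signal loss in rain.

    Two-way transmission T = exp(-2 * alpha * range), with the
    extinction coefficient alpha growing with rain rate R as a * R**b.
    The coefficients a and b are illustrative placeholders.
    """
    alpha = a * rain_rate_mm_h ** b   # extinction per meter of path
    return math.exp(-2 * alpha * range_m)
```

Because the loss is exponential in the product of extinction and range, doubling either the rain-driven extinction or the distance collapses the returned signal quickly, which is consistent with the short-range false positives and long-range missed detections described above.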
Dr. Valentina Donzella of WMG said: "We have confirmed that the heavier the rain and the longer the range, the greater the impact on the lidar sensor's ability to fully detect objects."
If the "binocular radar" mentioned above can be carried out, it may kill two birds with one stone, help Tesla, and solve the shortcomings of vehicle lidar target detection in severe weather conditions!