Growing up in the San Francisco Bay Area, you encounter two kinds of weather: pleasant and foggy. Foggy is the best. “Thick as pea soup,” it used to be called, and not just when it rolled over Ocean Beach and advanced like a slow-motion wall over the Sunset, the Outer Richmond, and Golden Gate Park. Driving in it, we’d turn off the high beams and take it real slow. But how do autonomous vehicles see through fog?
MIT has an idea.
From Engadget:
Most autonomous vehicles use LIDAR sensors and/or cameras to figure out where they are on the road, but cameras can be thrown off by lighting conditions or snow-covered signs and lane markings, and LIDAR often becomes less accurate in inclement weather. GPR, on the other hand, sends electromagnetic pulses into the ground to measure the specific combination of soil, rocks and roots. That data is turned into a map for self-driving vehicles.
The system, which uses a type of GPR called Localizing Ground Penetrating Radar developed at the MIT Lincoln Laboratory, offers a few benefits. For starters, it doesn’t matter if the road is snow-covered or if visibility is blocked by fog. And conditions under the road tend to change less often than features like lane striping and signage.
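To make the idea concrete, here’s a minimal sketch of what map-matching localization could look like. This is my own illustration, not MIT Lincoln Laboratory’s actual system: I’m assuming the prior map is a simple array of subsurface signatures recorded at known positions along the road, and that the vehicle localizes by finding the mapped position whose signature best correlates with the scan it’s measuring right now.

```python
import numpy as np

def localize_scan(prior_map: np.ndarray, scan: np.ndarray) -> int:
    """Return the index along the prior map where the current GPR scan
    matches best, using normalized correlation.

    prior_map: (N, D) array -- a D-sample subsurface signature at each of
               N positions along the mapped road (hypothetical format).
    scan:      (D,) array -- the signature measured right now.
    """
    # Normalize each signature so overall amplitude changes (e.g. wetter
    # soil after rain) matter less than the shape of the return.
    map_norm = prior_map / (np.linalg.norm(prior_map, axis=1, keepdims=True) + 1e-9)
    scan_norm = scan / (np.linalg.norm(scan) + 1e-9)

    # Score the current scan against every mapped position and take the
    # best match as the vehicle's estimated location along the road.
    scores = map_norm @ scan_norm
    return int(np.argmax(scores))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    prior_map = rng.normal(size=(500, 64))   # 500 mapped positions, 64-sample signatures
    true_pos = 137
    scan = prior_map[true_pos] + 0.1 * rng.normal(size=64)  # noisy re-observation
    print(localize_scan(prior_map, scan))    # expected: 137
```

The appeal of the approach is right there in the matching step: snow, fog, and repainted lane lines never enter into it, because the thing being compared is what’s under the asphalt, not what’s on top of it.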