From the article:
"Our analysis of road segments in California and Virginia revealed that the number of segments with observed HBEs was 18 times greater than those with reported crashes. While crash data is notoriously sparse — requiring years to observe a single event on some local roads — HBEs provide a continuous stream of data, effectively filling the gaps in the safety map."
So we don't have to wait until an accident actually occurs before we can identify unsafe roads and improve them.
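To make the quoted coverage claim concrete, here's a minimal sketch of the comparison the article is describing. All data and column meanings here are invented for illustration; this is just counting segments with at least one HBE vs. at least one reported crash, then ranking segments by HBE count so "unsafe" candidates surface before any crash is recorded.

    # Hedged sketch: toy per-segment data standing in for real telemetry.
    # The tuples (segment_id, hbe_count, crash_count) are hypothetical,
    # not from the article or any real dataset.
    segments = [
        ("A", 42, 1),
        ("B", 17, 0),
        ("C", 0, 0),
        ("D", 9, 0),
        ("E", 30, 2),
    ]

    with_hbe = sum(1 for _, hbe, _ in segments if hbe > 0)
    with_crash = sum(1 for _, _, crash in segments if crash > 0)
    print(f"segments with HBEs: {with_hbe}, with crashes: {with_crash}, "
          f"ratio: {with_hbe / with_crash:.1f}x")

    # Rank candidate "unsafe" segments by HBE count -- no crash required.
    for seg_id, hbe, crash in sorted(segments, key=lambda s: s[1], reverse=True)[:3]:
        print(seg_id, hbe, crash)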
I'd love to see them incorporate visual detection of vehicle crash debris as well. There are two intersections in my area that consistently have crash debris: broken window glass, broken plastic parts, license plates. I know they are dangerous, but I don't know if autonomous vehicles also know that they are dangerous.
Google/Apple probably collect a massively larger amount of data than those other companies, putting those companies at risk of losing future revenue.
Between Google and Apple, pretty much every car in the US is monitored.
Where Google/Apple's coverage is quite valuable is in near-real-time speeds for atypical events -- say, yesterday's Super Bowl. But that's not what this blog post is about -- this post is about a well-established pattern that can be identified with historical datasets.
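To illustrate the distinction between the two uses (real-time anomaly vs. historical pattern), here's a hedged sketch; the speeds, threshold, and variable names are all assumed, not drawn from the post.

    from statistics import mean, stdev

    # Toy per-segment speed observations (mph); values are invented.
    historical = [62, 61, 63, 60, 64, 62, 61, 63]  # long-run typical readings
    live = 35  # a near-real-time reading during an atypical event

    # Near-real-time use: flag a live reading far below the historical baseline.
    mu, sigma = mean(historical), stdev(historical)
    if live < mu - 3 * sigma:
        print(f"atypical slowdown: {live} mph vs baseline {mu:.0f} +/- {sigma:.1f}")

    # Historical-pattern use: an aggregate over the whole record suffices;
    # no real-time feed is required.
    print(f"long-run mean speed: {mu:.1f} mph over {len(historical)} samples")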
All that to say: vendors sell a wide variety of data products to transportation planners, but just because Google is now entering this niche market doesn't mean they'll be "the best" or even realize what their strengths are.
> It's not a lack of knowledge by Caltrans or Santa Clara County's congestion management agency that is keeping that interchange as-is. Rather, it's the physical constraints of a nearby airport (so no room for flyovers), a nearby river (so probably no tunneling), and surrounding private landowners and train tracks.
The most recent budget estimate for any changes to this interchange is $1bn.