The Waymo Open Dataset contains data collected over the course of the millions of miles Waymo’s cars have driven in Phoenix, Kirkland, Mountain View, and San Francisco, and it covers a wide variety of urban and suburban environments during day and night, dawn and dusk, and sunshine and rain. Samples are divided into 1,000 driving segments, each of which captures 20 seconds of continuous driving at 10 Hz, or 200 frames per segment and 200,000 frames in total, through the sensors affixed to every Waymo car. These include five custom-designed lidars (which bounce light off of objects to map them three-dimensionally) and five front- and side-facing cameras.
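As a quick sanity check of those figures, the frame counts follow directly from the segment count, segment duration, and capture rate reported above (the constant names below are illustrative, not from any Waymo API):

```python
# Back-of-the-envelope check of the dataset's frame counts,
# using the figures stated in the article.
NUM_SEGMENTS = 1_000      # driving segments in the dataset
SEGMENT_SECONDS = 20      # continuous driving per segment
FRAME_RATE_HZ = 10        # sensor capture rate

frames_per_segment = SEGMENT_SECONDS * FRAME_RATE_HZ  # 200
total_frames = NUM_SEGMENTS * frames_per_segment      # 200,000

print(frames_per_segment, total_frames)  # 200 200000
```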
The corpus additionally includes lidar frames and camera images labeled with vehicles, pedestrians, cyclists, and signage, for a total of 12 million 3D labels and 1.2 million 2D annotations. Waymo says the camera and lidar frames have been synchronized by its in-house 3D perception models, which fuse data from multiple sensor sources and obviate the need for manual alignment.