> They were attempting to pull off AR effects on the transparent OLED windows of the bus without accounting for lens distortion, field of view, parallax, occlusion, etc., and were frustrated and mystified when things didn’t appear to line up. They were completely naive to what depth and scale cues are and how to deploy them.
A. You need to know where a passenger's eyes are to display the POI in the right place. Even if each row gets its own dedicated screen, you still have to account for the vertical position of each head (people differ in height) and for head movement, hence the eye tracking.
B. If a window is shared between multiple people, you end up with a POI mess: the information displayed is multiplied by the number of passengers on the bus, since each viewpoint needs its own correctly placed overlay.
Top-down sketch of one side of the bus: rows r9-r12, each with its own window w9-w12, and T a passenger seated in row 10. Depending on where T's head is, the same outside POI can line up with w10 or with w11:

|- r9  -| w9
|       | w9
|- r10 -| w10   T
|       | w10
|- r11 -| w11
|       | w11
|- r12 -| w12
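To make point A concrete, here is a minimal sketch (my own illustration, not from the bus project) of why eye position matters: placing a POI marker on a transparent window is just intersecting the eye-to-POI ray with the window plane. All coordinates and constants below are hypothetical bus-local metres.

```python
# Hypothetical bus-local frame: x across the bus, y along it, z up.
# The window glass is modelled as the plane x = WINDOW_X.
WINDOW_X = 1.2  # assumed distance from the passenger's seat to the glass

def poi_on_window(eye, poi, window_x=WINDOW_X):
    """Return the (y, z) point on the window plane where the eye->POI
    ray crosses it, or None if the ray never reaches the glass."""
    ex, ey, ez = eye
    px, py, pz = poi
    dx = px - ex
    if dx == 0:
        return None  # ray runs parallel to the window plane
    t = (window_x - ex) / dx
    if t <= 0:
        return None  # the glass is behind the viewer
    return (ey + t * (py - ey), ez + t * (pz - ez))

# Two passengers in the same seat, same POI, different heights:
# the marker lands at a different height on the glass for each.
tall = poi_on_window(eye=(0.5, 0.0, 1.3), poi=(20.0, 5.0, 2.0))
short = poi_on_window(eye=(0.5, 0.0, 1.0), poi=(20.0, 5.0, 2.0))
print(tall, short)
```

Since the correct marker position depends on the eye coordinates, a fixed overlay can only ever be right for one head position, which is why shared windows (point B) degenerate into one overlay per passenger.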
IMHO the only practical way is with personal headsets like [0], but then you don't need a bus: walk, or take any other mode of transport. It's AR, not VR.