Letting one or more users interact with situated displays through handheld devices is compelling for public-private display interaction or tabletop collaboration. So far, interactive mobile systems have either been limited to basic interaction techniques on self-contained mobile devices or have required considerable infrastructure for large-screen interaction, making them impractical outside the lab. Furthermore, research has indicated that users expect handheld AR devices to act as transparent frames, which current-generation AR applications on smartphones do not deliver: they instead render the scene from the point of view of the device camera. Geometrically correct, or user-perspective, rendering allows users to integrate information from both the physical context and the handheld device. This can be particularly beneficial for map information depicted on large public screens.
We envision mobile handheld systems with user-perspective rendering that can augment public displays without the need for extensive infrastructure.
We developed a first prototype that provides user-perspective magic lenses for situated displays, showing dynamically changing content at interactive frame rates without additional infrastructure. The system only requires access to a screencast of the situated display, which can easily be provided through common streaming platforms, and is otherwise self-contained. Our system performs all computations on the mobile device and hence scales easily to multiple users.
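To illustrate the core geometry behind user-perspective rendering: given the user's eye position and the device pose relative to the display plane, the portion of the display seen "through" the device frame is found by casting rays from the eye through the device-screen corners onto the display plane. The following is a minimal sketch of that projection, not the prototype's actual implementation; all function names, coordinates, and dimensions are illustrative assumptions.

```python
import numpy as np

def intersect_ray_plane(origin, direction, plane_point, plane_normal):
    """Return the point where the ray origin + t*direction meets the plane."""
    t = np.dot(plane_point - origin, plane_normal) / np.dot(direction, plane_normal)
    return origin + t * direction

def user_perspective_region(eye, device_corners, plane_point, plane_normal):
    """Cast a ray from the eye through each device-screen corner onto the
    display plane. The resulting quad is the region of the situated display
    that should be shown on the device to mimic a transparent frame."""
    return [intersect_ray_plane(eye, corner - eye, plane_point, plane_normal)
            for corner in device_corners]

# Hypothetical setup: display in the z = 0 plane, a 16 cm x 10 cm device
# screen held 0.3 m in front of it, eye 0.6 m behind the device.
eye = np.array([0.0, 0.0, 0.9])
device_corners = [np.array([-0.08, -0.05, 0.3]),
                  np.array([ 0.08, -0.05, 0.3]),
                  np.array([ 0.08,  0.05, 0.3]),
                  np.array([-0.08,  0.05, 0.3])]
quad = user_perspective_region(eye, device_corners,
                               plane_point=np.array([0.0, 0.0, 0.0]),
                               plane_normal=np.array([0.0, 0.0, 1.0]))
```

The resulting quad, mapped back to display pixel coordinates via the screencast resolution, defines the crop of the streamed display content to warp onto the device screen; because the quad is larger than the device frame, the view is geometrically correct rather than a camera-perspective view.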