Multi-Display Environments (MDEs) [REK99] combine several physical displays into one virtual screen and input space. Typically, such systems are built either by tiling multiple monitors in a fixed spatial configuration or by combining projectors and cameras into a dynamic working environment [PWS09].
We want to build a dynamic, tabletop-like MDE solely from handheld devices. The goal of this project is, given two (or more) handheld devices (e.g., smartphones, tablets), to determine the x- and y-offsets between the devices using local sensors only.
In this project, you will address the following challenges:
- Locally sense the offset. Use only built-in sensors, such as the magnetometer or front-facing camera, or sensors attached to the devices. In particular, do not use external equipment.
- Communicate the offset to the device. If additional sensors are attached to a device, communicate the presence (or absence) of other devices and their x- and y-offsets to it, either via Bluetooth or WiFi.
- Support different device types. For example, combine a smartphone with a tablet.
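One possible shape for the communication step is a small message that each device broadcasts over Bluetooth or WiFi. A minimal sketch in Python follows; the message fields, names, and units are illustrative assumptions, not part of the project specification:

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical message a device could broadcast to its peers.
# Field names and the millimetre unit are assumptions for illustration.
@dataclass
class OffsetMessage:
    sender_id: str      # stable identifier of the announcing device
    peer_id: str        # device the offset refers to
    x_offset_mm: float  # horizontal offset of the peer
    y_offset_mm: float  # vertical offset of the peer
    present: bool       # whether the peer is currently sensed at all

def encode(msg: OffsetMessage) -> str:
    """Serialise the message to JSON for transport."""
    return json.dumps(asdict(msg))

def decode(raw: str) -> OffsetMessage:
    """Rebuild the message on the receiving device."""
    return OffsetMessage(**json.loads(raw))
```

JSON is chosen here only because it is transport-agnostic: the same payload can travel over a Bluetooth serial link or a WiFi socket unchanged.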
Approaches to address these challenges include, but are not limited to, exploiting additional external sensors such as IR [BIH08] or leveraging internal sensors such as magnetic field sensors [KRY10, ABW11].
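The magnetic-field approaches in [KRY10] and [ABW11] exploit the fact that the field of a small magnet falls off roughly with the cube of distance. A minimal sketch of distance estimation under an idealised on-axis dipole model; the calibration constant below is a made-up placeholder, not a measured value:

```python
# Idealised on-axis dipole model: B(d) = k / d**3, where k lumps the
# magnet's moment and physical constants. k must be calibrated per
# magnet; the value here is purely an assumption for illustration.
K_CALIB = 2.0e-7  # tesla * m^3, hypothetical calibration constant

def field_at(distance_m: float) -> float:
    """Flux density (tesla) the model predicts at a given distance."""
    return K_CALIB / distance_m ** 3

def distance_from_field(b_tesla: float) -> float:
    """Invert the model: estimate distance from a magnetometer reading."""
    return (K_CALIB / b_tesla) ** (1.0 / 3.0)

# Round trip: a reading predicted at 5 cm maps back to 5 cm.
d_est = distance_from_field(field_at(0.05))
```

In practice the Earth's field and sensor noise must be subtracted or filtered out first, and a single magnitude reading yields only distance, not direction; recovering x- and y-offsets needs additional structure, which is part of the project.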
To reach this goal you should have:
- Basic signal-processing or computer vision knowledge
- Fun physically assembling small components
- Ideally, experience with the Arduino platform
We will support you with:
- Arduino microcontrollers and sensors that can communicate with Android devices over Bluetooth and with iOS devices via the web
- Android or iOS devices
- Regular feedback and support from your supervisor
[ABW11] Daniel Ashbrook, Patrick Baudisch, and Sean White. 2011. Nenya: subtle and eyes-free mobile input with a magnetically-tracked finger ring. In Proceedings of the 2011 annual conference on Human factors in computing systems (CHI ’11). ACM, New York, NY, USA, 2043-2046. DOI=10.1145/1978942.1979238 http://doi.acm.org/10.1145/1978942.1979238
[BIH08] Alex Butler, Shahram Izadi, and Steve Hodges. 2008. SideSight: multi-“touch” interaction around small devices. In Proceedings of the 21st annual ACM symposium on User interface software and technology (UIST ’08). ACM, New York, NY, USA, 201-204. DOI=10.1145/1449715.1449746 http://doi.acm.org/10.1145/1449715.1449746
[KRY10] Hamed Ketabdar, Mehran Roshandel, and Kamer Ali Yüksel. 2010. Towards using embedded magnetic field sensor for around mobile device 3D interaction. In Proceedings of the 12th international conference on Human computer interaction with mobile devices and services (MobileHCI ’10). ACM, New York, NY, USA, 153-156. DOI=10.1145/1851600.1851626 http://doi.acm.org/10.1145/1851600.1851626
[PWS09] Christian Pirchheim, Manuela Waldner, and Dieter Schmalstieg. 2009. Deskotheque: Improved Spatial Awareness in Multi-Display Environments. In Proceedings of the IEEE Virtual Reality Conference (VR 2009). IEEE, 123-126. DOI=10.1109/VR.2009.4811010 http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=4811010&isnumber=4810973
[REK99] Jun Rekimoto and Masanori Saitoh. 1999. Augmented surfaces: a spatially continuous work space for hybrid computing environments. In Proceedings of the SIGCHI conference on Human factors in computing systems: the CHI is the limit (CHI ’99). ACM, New York, NY, USA, 378-385. DOI=10.1145/302979.303113 http://doi.acm.org/10.1145/302979.303113