Mixed Reality Interaction 2014/2015

Introduction

Students from Salzburg University of Applied Sciences can find up-to-date information here.

The course Mixed Reality Interaction at Salzburg University of Applied Sciences will enable students to familiarize themselves with advanced concepts in the fields of Augmented Reality, Virtual Reality, Computer Vision and Mobile Interaction. They will be able to implement advanced registration and tracking algorithms and novel interaction methods.

Course Content

The interactive lectures will take place in three one-day blocks. The structure is as follows:

Day One

  • Introduction to Mixed Reality
  • AR display technologies
  • Tracking introduction
  • Marker tracking
  • Natural feature tracking
  • Intro practical assignments

Day Two

  • AR rendering introduction
  • AR tools 1: overview
  • AR tools 2: osgart
  • AR interaction techniques
  • Designing AR interfaces
  • Designing AR interfaces exercise

Day Three

  • Camera + Optical See-Through calibration
  • Advanced 3D tracking: SLAM
  • Advanced rendering + visualization
  • AR and the Web
  • AR collaboration
  • AR authoring
  • AR evaluation

Practical Assignments

You can choose between several assignments in the fields of vision-based tracking and mobile interaction. You will work in groups of up to two students and may pick any of the following topics:

Robust Mobile Marker Tracking (assignment for individuals)

Implement a simple marker tracker, as in ARToolKit, using OpenCV under Android. Extend the tracker to be robust against partial occlusion of the marker edges. Balance robustness against runtime performance.
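One way to handle a partially occluded marker corner is to fit lines to the visible edge points and intersect them to recover the hidden corner. Below is a minimal NumPy sketch of that idea (an illustration, not ARToolKit's actual implementation):

```python
import numpy as np

def fit_line(points):
    """Total-least-squares line fit: returns (centroid, unit direction)."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # Principal direction of the centered points via SVD.
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[0]

def intersect_lines(p1, d1, p2, d2):
    """Intersect two parametric lines p + t*d (fails if they are parallel)."""
    # Solve p1 + t1*d1 = p2 + t2*d2 as a 2x2 linear system.
    a = np.array([[d1[0], -d2[0]], [d1[1], -d2[1]]])
    b = np.array([p2[0] - p1[0], p2[1] - p1[1]])
    t1, _ = np.linalg.solve(a, b)
    return p1 + t1 * d1

# Example: points sampled from two marker edges whose shared corner
# at (0, 0) is hidden by an occluder.
top_edge = [(2.0, 0.0), (5.0, 0.0), (9.0, 0.0)]    # along y = 0
left_edge = [(0.0, 3.0), (0.0, 6.0), (0.0, 8.0)]   # along x = 0
corner = intersect_lines(*fit_line(top_edge), *fit_line(left_edge))
print(corner)  # close to [0, 0]
```

In a real tracker the edge points would come from contour segments of the thresholded marker image, and a robust fit (e.g. RANSAC) would reject outliers near the occlusion boundary.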

Natural Feature Tracking

Implement a natural feature tracker using tracking-by-detection in OpenCV. Experiment with different detectors and descriptors such as SIFT or Oriented BRIEF (ORB). Extend the tracker to include an iterative patch tracking phase based on NCC or SSD, using built-in OpenCV functions or custom code. Optional: optimize your code to allow real-time tracking on an Android smartphone.
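The iterative patch tracking phase can score candidate positions with normalized cross-correlation. A minimal NumPy sketch of NCC-based patch refinement follows; in practice OpenCV's `cv2.matchTemplate` does this far faster:

```python
import numpy as np

def ncc(patch, candidate):
    """Normalized cross-correlation between two equally sized patches."""
    a = patch - patch.mean()
    b = candidate - candidate.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def track_patch(image, patch, prev_xy, radius=2):
    """Search a small window around the previous position for the
    location whose image patch maximizes NCC."""
    h, w = patch.shape
    px, py = prev_xy
    best, best_xy = -1.0, prev_xy
    for y in range(py - radius, py + radius + 1):
        for x in range(px - radius, px + radius + 1):
            cand = image[y:y + h, x:x + w]
            if cand.shape != patch.shape:
                continue
            score = ncc(patch, cand)
            if score > best:
                best, best_xy = score, (x, y)
    return best_xy, best

# Example: a 3x3 textured patch shifted by (1, 1) between two frames.
frame = np.zeros((10, 10))
frame[4:7, 4:7] = np.arange(9).reshape(3, 3)
template = frame[4:7, 4:7].copy()
shifted = np.zeros((10, 10))
shifted[5:8, 5:8] = np.arange(9).reshape(3, 3)
pos, score = track_patch(shifted, template, (4, 4))
print(pos)  # (5, 5)
```

Swapping the NCC score for a (negated) sum of squared differences gives the SSD variant mentioned above.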

Time-Multiplexed Screen Tracking

Design and implement time-multiplexed decoding of metamer pairs following the ideas in this paper. You can combine the time-multiplexed tracking approach with a simple marker tracker.
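The core decoding idea, two display frames that average to the same appearance for the eye but differ for a frame-synchronized camera, can be illustrated with a toy NumPy example. The actual metamer construction in the referenced paper is more involved; this only shows the temporal-difference decoding step:

```python
import numpy as np

# The embedded code region flips between the two display frames,
# while the background stays constant.
code_mask = np.zeros((8, 8), dtype=bool)
code_mask[2:6, 2:6] = True

frame_a = np.full((8, 8), 0.5)
frame_b = np.full((8, 8), 0.5)
frame_a[code_mask] = 0.6   # slightly brighter in frame A
frame_b[code_mask] = 0.4   # slightly darker in frame B
# The temporal average is 0.5 everywhere, so a human observer,
# integrating over time, sees a uniform grey.

# A camera capturing the individual frames recovers the pattern
# by thresholding the absolute frame difference.
decoded = np.abs(frame_a - frame_b) > 0.1
print(int(decoded.sum()))  # 16 code pixels recovered
```

The decoded pattern can then serve as the fiducial for the simple marker tracker mentioned above.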

Color Ramp-based Screen Tracking

Implement 2D screen tracking using a color ramp as seen here. Optional: extend your approach to make the color ramp invisible to the user through bi-directional network communication.
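The basic principle can be sketched as follows: the display encodes each pixel's coordinates in its color, and the tracker inverts that mapping from a camera sample. The resolution values and channel assignment below are illustrative assumptions, not taken from the referenced approach:

```python
import numpy as np

W, H = 640, 480  # assumed display resolution

def ramp_color(x, y):
    """Color the display shows at pixel (x, y): red encodes the column,
    green encodes the row, both normalized to [0, 1]."""
    return np.array([x / (W - 1), y / (H - 1), 0.0])

def decode_position(color):
    """Invert the ramp: recover (x, y) from an observed pixel color."""
    return color[0] * (W - 1), color[1] * (H - 1)

# A camera observes the ramp color under a tracked point at (200, 150).
observed = ramp_color(200, 150)
x, y = decode_position(observed)
print(round(x), round(y))  # 200 150
```

A real implementation has to undo the camera's color response first (e.g. by calibrating against known ramp samples), since the observed values will not match the displayed ones exactly.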

2D Hand Tracking on a Planar Surface

Implement a simple 2D hand tracker using the following constraints:

  • image capture via a (wide field-of-view) head-mounted camera
  • a planar surface, such as a table, is assumed to be visible below the hands
  • optional: a marker tracker is available for tracking a marker on the table
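Under these constraints, a very crude starting point is color-based segmentation of the hand against the table. The RGB thresholds below are an illustrative heuristic, not a tuned skin model:

```python
import numpy as np

def skin_mask(rgb):
    """Crude skin-color heuristic on an RGB image (values 0..255):
    skin pixels tend to satisfy R > G > B with sufficient red dominance.
    A real tracker would use a trained color model instead."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (r > 95) & (r > g) & (g > b) & (r - b > 15)

def centroid(mask):
    """Centroid of the mask pixels, as (x, y), or None if empty."""
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Toy frame: a skin-colored blob on a grey table.
frame = np.full((6, 6, 3), 128, dtype=np.uint8)
frame[2:4, 2:4] = (200, 120, 80)   # hand-like pixels
mask = skin_mask(frame)
print(centroid(mask))  # (2.5, 2.5)
```

With the optional marker tracker, the centroid can be expressed in table coordinates rather than image coordinates.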

Extend your approach to distinguish hands touching the table from hands in mid-air or hovering above it. For example, you could employ:

  • a special lighting setup for casting shadows
  • a combination with a smartphone that detects touch events via its internal accelerometers
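The shadow-based option can be sketched as a blob-distance test: with a fixed light source, a hand touching the surface meets its own cast shadow, while a hovering hand is separated from it by a gap. A toy NumPy version on binary masks (real input would come from segmenting dark shadow regions in the camera image):

```python
import numpy as np

def touching(hand_mask, shadow_mask, gap_px=1):
    """True if any hand pixel lies within gap_px of a shadow pixel,
    a cheap stand-in for a proper distance transform."""
    hy, hx = np.nonzero(hand_mask)
    sy, sx = np.nonzero(shadow_mask)
    if len(hy) == 0 or len(sy) == 0:
        return False
    # Pairwise Chebyshev distances between blob pixels.
    dy = np.abs(hy[:, None] - sy[None, :])
    dx = np.abs(hx[:, None] - sx[None, :])
    return bool(np.maximum(dy, dx).min() <= gap_px)

# Hovering: fingertip blob and its shadow are well separated.
hand = np.zeros((8, 8), dtype=bool)
hand[4, 2:4] = True
shadow = np.zeros((8, 8), dtype=bool)
shadow[4, 6:8] = True
print(touching(hand, shadow))   # False

# Touching: the shadow has converged onto the fingertip.
shadow2 = np.zeros((8, 8), dtype=bool)
shadow2[4, 4:6] = True
print(touching(hand, shadow2))  # True
```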

Augmented Reality Browser

Implement a simple Augmented Reality browser under Android supporting basic features of browsers such as Junaio. Support at least:

  • geolocalization and orientation tracking
  • overlay of simple 3D objects
  • selection of 3D objects using picking
  • optional: integration of web-services like the Google Places API

You can use any 3D scene graph or low-level graphics API you like.
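The geolocalization and orientation part boils down to computing the bearing from the user to a point of interest, subtracting the device heading, and mapping the remaining angle into screen coordinates via the camera's field of view. A sketch of that pipeline (the field of view and screen width are assumed values):

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from (lat1, lon1) to (lat2, lon2)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360

def screen_x(poi_bearing, heading, fov_deg=60, width_px=1080):
    """Horizontal screen position of a POI, or None if outside the view."""
    rel = (poi_bearing - heading + 180) % 360 - 180   # wrap to [-180, 180)
    if abs(rel) > fov_deg / 2:
        return None
    return int((rel / fov_deg + 0.5) * width_px)

# POI due east of the user; device facing east puts it at screen center.
b = bearing_deg(47.8, 13.0, 47.8, 13.1)
print(round(b))             # 90 (east)
print(screen_x(90.0, 90.0)) # 540 (center of a 1080 px wide screen)
```

On Android the heading would come from the rotation vector sensor and the position from the location API; selection by picking then needs the full projection, not just this horizontal slice.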

ARCardboard

Build a monocular video see-through head-mounted display using Google Cardboard. Use Unity and Vuforia, or osgART, as software components. Create a simple Augmented Reality demo.

BookAR

Build a cross-media application that allows text selection on printed paper and triggers a text search in an associated PDF. Display the search results on or around the printed paper. You can use Unity and Vuforia for the tracking part, Apache PDFBox for PDF text extraction, and Apache Lucene for the search.
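The search side can be prototyped as a small inverted index mapping terms to page numbers, a stand-in for what Apache Lucene would provide over PDFBox-extracted text. The sample pages below are made up for illustration:

```python
from collections import defaultdict

def build_index(pages):
    """Tiny inverted index: term -> set of page numbers."""
    index = defaultdict(set)
    for page_no, text in enumerate(pages, start=1):
        for term in text.lower().split():
            index[term].add(page_no)
    return index

def search(index, query):
    """Pages containing every query term (simple AND semantics)."""
    sets = [index.get(t, set()) for t in query.lower().split()]
    return sorted(set.intersection(*sets)) if sets else []

pages = [
    "augmented reality combines real and virtual imagery",
    "marker tracking estimates the camera pose",
    "natural feature tracking avoids markers in augmented reality",
]
idx = build_index(pages)
print(search(idx, "augmented reality"))  # [1, 3]
```

The selected text from the printed page would play the role of the query, and the matching page numbers drive what is overlaid on or around the paper.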
