MMSys 2018 - Special Session on Immersive Multimedia Experiences, Amsterdam, June 12-15, 2018

Special Session on Immersive Multimedia Experiences

Important information

  • Submission deadline: January 19, 2018 (extended)
  • Acceptance notification: February 21, 2018
  • Camera ready deadline: April 19, 2018
  • Online submission: https://submissions.mmsys2018.org/immersivemedia
  • Submission format: 6-12 pages, using the ACM style format (double-blind review). Please see the submission guidelines for more information about the process.
  • Reproducibility: obtain an ACM reproducibility badge by making datasets and code available (authors will be contacted about making their artifacts available after paper acceptance)

Special Session chairs

  • Simon Gunkel, TNO, Netherlands
  • Yutaka Ishibashi, Nagoya Institute of Technology, Japan
  • Christian Timmerer, Alpen-Adria-Universität Klagenfurt and Bitmovin, Austria
  • Igor Curcio, Nokia Technologies, Finland

Scope & goals

Advances in AR, VR, and Multi-Sensory technologies enable compelling multimedia experiences that are more engaging and immersive than previously possible. Such experiences involve a variety of sensory inputs that capture the user's current situation and environment, and deliver sensory output to the user in response, addressing some or all of the human senses: sight, hearing, touch, smell, and taste. Creating new AR, VR, and Multi-Sensory systems, and ultimately new immersive multimedia user experiences, therefore requires combining many heterogeneous technologies.

Many challenges remain to be addressed to keep pace with the fast-growing requirements of these new systems and experiences. For example, Immersive Multimedia Experiences are more sensitive to delay and synchronization errors, and generally demand more resources from end devices and within the system (CPU, GPU, storage, and network bandwidth). This results in numerous challenges across the whole delivery pipeline: capturing and processing a multitude of sensor data; manipulating, streaming, and visualizing different multimedia streams; and assessing performance with new AR/VR/Multi-Sensory QoE and QoS metrics.

In this session, we would like to discuss these issues, connecting an interdisciplinary set of research areas, including computer vision, computer graphics, mobile and embedded systems, displays and optics, user interface design, and quality of experience, together with applications from a broad range of sectors, including entertainment, industry, military, and commerce.

Immersive Multimedia Experience (IME) topics of interest, related to AR, VR, Mixed Reality, and Multi-Sensory Experiences:

  • Innovative IME applications, software architectures, and systems design
  • Web-based IME
  • Networking, distributed systems and IME delivery (including QoS)
  • Over-the-top streaming of 360-degree and 3D content
  • Multimedia compression for visual search, AR and VR
  • Real-time systems and resource-constrained implementations
  • Mobile and embedded computing for IME
  • System-level energy management for mobile IME systems
  • IME sensors and display technologies
  • User interface designs for IME applications
  • QoE assessment of 360-degree and 3D immersive experiences and media content, including Multi-Sensory Experiences
  • Metadata and mapping of media and objects into IME
  • Security and privacy concerns in IME systems
  • Sensor fusion and ego-motion estimation
  • Active and passive stereo systems
  • 3D modeling and image-based localization
  • Adaptive/Personalized Multi-Sensory Experiences
  • Multi-Sensory Signal Processing
  • Human Factors (including user modelling) in IME
  • IME synchronization
  • AR, VR, Mixed Reality, and Multi-Sensory Experiences in different domains (e.g., health, education (smart learning environments), tourism, gaming, quality of life, e-commerce, collaboration, (tele-)presence, databases/datasets, interaction, affect, robotics)
  • AR, VR, Mixed Reality, and Multi-Sensory Experience standardization

Papers should be between six and twelve pages long (in PDF format), including references, prepared in the ACM style and written in English. This length enables authors to present entire multimedia systems, or research that builds on considerable amounts of earlier work, in a self-contained manner. MMSys papers are published in the ACM Digital Library; they remain available just as long as journal papers, so authors should not feel compelled by space limitations to publish extended works on top of an MMSys paper. Authors who submit very specific, detailed research work are encouraged to use fewer than 12 pages. All papers are reviewed double-blind.

Technical Program Committee

  • Gregorij Kurillo, UC Berkeley
  • Fons Kuijk, CWI
  • Rufael Mekuria, Unified Streaming
  • Joan Llobera, i2cat
  • Peter Schelkens, Vrije Universiteit Brussel
  • Teresa Chambel, University of Lisbon
  • Ted (Zixia) Huang, Google
  • Hartmut Seichter, University of Applied Sciences Schmalkalden
  • Jens Grubert, Coburg University
  • Denis Kalkofen, Graz University of Technology
  • Chun-Ying Huang, National Chiao Tung University
  • Seong Yong Lim, ETRI
  • Emmanuel Thomas, TNO
  • Vladan Velisavljevic, University of Bedfordshire
  • George Ghinea, Brunel University
  • Serhan Gül, Fraunhofer HHI
  • Sejin Oh, LG Electronics
  • Thomas Stockhammer, QUALCOMM
  • Conor Keighrey, Athlone Institute of Technology
  • Niall Murray, Athlone Institute of Technology
  • Raimund Schatz, Austrian Institute of Technology
  • Simon Gunkel, TNO, Netherlands
  • Yutaka Ishibashi, Nagoya Institute of Technology, Japan
  • Christian Timmerer, Alpen-Adria-Universität Klagenfurt and Bitmovin, Austria
  • Igor Curcio, Nokia Technologies, Finland

Sponsors

  • ACM
  • SIGMM

Co-sponsors

  • SIGCOMM
  • SIGMOBILE
  • SIGOPS

Gold supporters

  • Adobe

Silver supporters

  • Bitmovin
  • Unified Streaming
  • DASH-IF
  • Comcast
  • CWI