University of Southampton Institutional Repository

Material recognition for immersive interactions in virtual/augmented reality



Heng, Yuwen, Dasmahapatra, Srinandan and Kim, Hansung (2023) Material recognition for immersive interactions in virtual/augmented reality. In Proceedings of the 2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops, VRW 2023. IEEE. pp. 577-578. (doi:10.1109/VRW58643.2023.00131).

Record type: Conference or Workshop Item (Paper)

Abstract

To provide an immersive experience in a mirrored virtual world, with spatially synchronised audio, visualisation of reproduced real-world scenes and haptic sensing, it is necessary to know the materials of object surfaces, which provide the optical and acoustic properties for the rendering engine. We focus on identifying materials from real-world images to reproduce more realistic and plausible virtual environments. To cope with the considerable variation in material appearance, we propose the DPT architecture, which dynamically decides the dependency on different patch resolutions. We evaluate the benefits of learning from multiple patch resolutions on the LMD and OpenSurfaces datasets.
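The abstract describes an architecture that fuses features computed from image patches at several resolutions, with an input-dependent ("dynamic") weighting over those resolutions. The paper's actual DPT design is not reproduced in this record, so the following is only a toy sketch of the general idea: patch features are computed at two hypothetical resolutions and fused with a softmax gate driven by a simple input statistic (feature variance). All function names and the gating heuristic are illustrative assumptions, not the authors' method.

```python
# Toy sketch (NOT the authors' DPT code): fuse patch features from multiple
# resolutions with an input-dependent softmax gate. The variance-based gate
# is an illustrative stand-in for a learned gating network.
import math

def patch_features(image, patch_size):
    """Mean intensity per non-overlapping patch: a stand-in for learned
    patch embeddings at one resolution."""
    h, w = len(image), len(image[0])
    feats = []
    for i in range(0, h, patch_size):
        for j in range(0, w, patch_size):
            block = [image[r][c]
                     for r in range(i, min(i + patch_size, h))
                     for c in range(j, min(j + patch_size, w))]
            feats.append(sum(block) / len(block))
    return feats

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def dynamic_fusion(image, patch_sizes=(2, 4)):
    """Compute patch features at each resolution, gate the resolutions by
    the variance of their features (so the weighting depends on the input),
    and return the gate weights plus a fused scalar feature."""
    per_res = [patch_features(image, p) for p in patch_sizes]
    variances = []
    for feats in per_res:
        mu = sum(feats) / len(feats)
        variances.append(sum((x - mu) ** 2 for x in feats) / len(feats))
    weights = softmax(variances)
    fused = sum(w * (sum(f) / len(f)) for w, f in zip(weights, per_res))
    return weights, fused
```

In the real architecture, the gate would be a learned module and the patch features would come from a deep backbone; the sketch only shows the shape of the computation, i.e. per-resolution features followed by an input-conditioned weighted combination.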

This record has no associated files available for download.

More information

Published date: 1 May 2023
Additional Information: Funding Information: This work was partially supported by the EPSRC Programme Grant Immersive Audio-Visual 3D Scene Reproduction (EP/V03538X/1) and partially by the Korea Institute of Science and Technology (KIST) Institutional Program (Project No. 2E31591).
Venue - Dates: 2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops, VRW 2023, Shanghai, China, 2023-03-25 - 2023-03-29
Keywords: artificial intelligence, computer vision - scene understanding, computing methodologies, human-centered computing - human-computer interaction (HCI), interaction paradigms - virtual reality

Identifiers

Local EPrints ID: 479902
URI: http://eprints.soton.ac.uk/id/eprint/479902
PURE UUID: 34fecbd2-df0d-4d94-8483-39250237d4c3
ORCID for Yuwen Heng: orcid.org/0000-0003-3793-4811
ORCID for Hansung Kim: orcid.org/0000-0003-4907-0491

Catalogue record

Date deposited: 28 Jul 2023 16:47
Last modified: 18 Mar 2024 03:56


Contributors

Author: Yuwen Heng
Author: Srinandan Dasmahapatra
Author: Hansung Kim

