This standard identifies an augmented mutual space representation that supports the efficient creation of
shared spaces in remote AR (Augmented Reality), VR (Virtual Reality), and MR (Mixed Reality)
collaboration scenarios, along with its purpose and use cases. The space representation forms a
hierarchical graph structure in consideration of vertical and horizontal relationships between objects.
This draft defines the components of the augmented mutual space representation and the details of each
type of relationship as edge data. The properties of edges are determined from the semantic and geometric information of the objects in the input digital twin. This standard details the specific information about each object needed to construct node data, as well as how to construct the node data from that information.
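The node and edge structure described above might be sketched as follows. This is a minimal illustration only: the class names, field names, and the two relationship categories as enum values are assumptions made for this sketch, not identifiers defined by this standard.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Tuple

class RelationType(Enum):
    """Edge types of the hierarchical graph (names are illustrative)."""
    VERTICAL = "vertical"      # e.g. one object supports or contains another
    HORIZONTAL = "horizontal"  # e.g. adjacency between objects at the same level

@dataclass
class ObjectNode:
    """A node built from one digital-twin object: semantic label plus geometry."""
    node_id: str
    semantic_label: str                   # e.g. "desk", "monitor"
    bbox_min: Tuple[float, float, float]  # axis-aligned bounding box (metres)
    bbox_max: Tuple[float, float, float]

@dataclass
class RelationEdge:
    """A typed edge between two object nodes, referenced by id."""
    source: str
    target: str
    relation: RelationType

@dataclass
class MutualSpaceGraph:
    """Hierarchical graph representing one physical space."""
    nodes: List[ObjectNode] = field(default_factory=list)
    edges: List[RelationEdge] = field(default_factory=list)

    def add_support(self, lower: str, upper: str) -> None:
        # Vertical relationship: `upper` rests on `lower` (e.g. monitor on desk).
        self.edges.append(RelationEdge(lower, upper, RelationType.VERTICAL))

# Example: a desk supporting a monitor.
g = MutualSpaceGraph()
g.nodes.append(ObjectNode("desk01", "desk", (0.0, 0.0, 0.0), (1.2, 0.8, 0.7)))
g.nodes.append(ObjectNode("mon01", "monitor", (0.4, 0.3, 0.7), (0.9, 0.5, 1.0)))
g.add_support("desk01", "mon01")
```

In this sketch the edge properties would be derived from the objects' semantic labels and bounding boxes, mirroring how the draft derives them from the input digital twin.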
Furthermore, this draft covers the system architecture and specific use cases that support the creation
of collaborative spaces using the described spatial data. Various cases are presented that enable
efficient spatial configuration and interaction in remote collaborative environments through this spatial
data representation.
Overall, this draft addresses the concepts, syntax, and semantics of augmented mutual space
representation for AR/MR/XR remote collaboration. It specifies:
- a data representation model for expressing indoor scene data,
- the data types and classes that together constitute the data representation model,
- an application program interface that supports the generation of remote collaboration space between
dissimilar spaces, using the data representation model, and
- examples of generating remote collaboration spaces through augmented mutual space representation
between dissimilar spaces.
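As a rough illustration of the last two items, the sketch below pairs common object classes between two dissimilar spaces to seed a shared collaboration space. The function name, the dictionary-based space summary, and the min-count pairing rule are all assumptions made for this example; the actual interface is defined in the body of the standard.

```python
from typing import Dict

def generate_mutual_space(space_a: Dict[str, int],
                          space_b: Dict[str, int]) -> Dict[str, int]:
    """Pair object classes present in both spaces (illustrative only).

    Each space is summarised as {semantic_label: object_count}; the shared
    space keeps the smaller count of each common class, so every shared
    object has a counterpart in both physical spaces.
    """
    return {label: min(space_a[label], space_b[label])
            for label in space_a.keys() & space_b.keys()}

office = {"desk": 2, "chair": 4, "whiteboard": 1}
home = {"desk": 1, "chair": 2, "sofa": 1}
shared = generate_mutual_space(office, home)
# shared == {"desk": 1, "chair": 2}
```

Objects without a counterpart (the whiteboard and sofa above) fall outside the mutual space, which is the dissimilarity problem the graph representation is designed to manage.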
The purpose of augmented mutual space representation is to generate and manage shared mutual
spaces between different physical spaces for AR, VR, and MR remote collaboration. In expressing the
physical space of reality as a Digital Twin (DT) for use in AR, VR, and MR, existing representation
methods have mainly focused on geometric information, with the segmentation of each object
reflected in its semantic information. However, it is difficult to find and manage common elements for
generating a shared mutual space between physical spaces expressed in these representations,
owing to dissimilarity between the spaces. To address these requirements, this draft defines a space
representation method that expresses physical space for remote collaboration space generation and
management. The representation is expected to be the basis for building an environment where one
can experience the other’s physical space beyond the limitations of time and space.
Key benefits of the framework include:
- Enhanced sense of coexistence in remote collaboration: Expanding the set of equally permitted
interactions in remote collaboration broadens shared experiences, ultimately strengthening the sense of
coexistence among collaboration participants.
- Enhanced sense of presence in remote space: Users gain a stronger sense of presence and
immersion through rich interactions with various objects in the remote space, even across dissimilar real
spaces.
- Enhanced efficiency of remote collaboration: The high sense of coexistence in remote collaboration
and the high sense of presence in remote space enable communication at the same level as on-site
collaboration, thereby enhancing the efficiency of remote collaboration.