Engineering with Computers
https://doi.org/10.1007/s00366-019-00792-3
ORIGINAL ARTICLE
A comprehensive survey ofAR/MR‑based co‑design inmanufacturing
PengWang1· ShushengZhang1· MarkBillinghurst1,2· XiaoliangBai1· WeipingHe1· ShuxiaWang1·
MengmengSun1· XuZhang1
Received: 11 March 2018 / Accepted: 3 June 2019
© Springer-Verlag London Ltd., part of Springer Nature 2019
Abstract
For more than two decades, Augmented Reality (AR)/Mixed Reality (MR) has received increasing attention from
researchers and practitioners in the manufacturing community, because it has applications in many fields, such as product
design, training, maintenance, assembly, and other manufacturing operations. However, to the best of our knowledge, there
has been no comprehensive review of AR-based co-design in manufacturing. This paper presents a comprehensive survey
of existing research, projects, and technical characteristics between 1990 and 2017 in the domain of co-design based on AR
technology. Among these papers, more than 90% of them were published between 2000 and 2017, and these recent relevant
works are discussed at length. The paper provides a comprehensive academic roadmap and useful insight into the state-of-
the-art of AR-based co-design systems and developments in manufacturing for future researchers all over the world. This
work will be useful to researchers who plan to utilize AR as a tool for design research.
Keywords Augmented reality (AR)· AR-based co-design· Collaborative work· Virtual reality (VR)· CAD
1 Introduction
This survey paper reviews the use of Augmented Reality
(AR)/Mixed Reality (MR) to support co-design in manufac-
turing. Design is a multi-stage, creative, collaborative and
iterative engineering process [1] which involves significant
research, inherently resource- and time-intensive activities
to generate innovative ideas [2]. Co-design is a ‘human-
centered’ technical activity that allows multiple people to
work together collaboratively on design projects, across
different times, spaces, disciplines, and cultures [3]. It is
designing with people not merely for people, and designers
could propose more innovative solutions when working in
a co-design situation [4, 5]. However, co-design can have
many meanings, such as remote collaboration, co-located
collaboration, real-time multi-user collaboration, independ-
ent view collaboration, etc.
In today's highly competitive, connected business environment, there is a trend towards manufacturing globalism, which drives the development of co-design for companies seeking to stay competitive. The manufacturing
industry is facing the challenge of designing innovative
products in a timely response to the market. At the same
time, the role of consumers has changed, from passively
accepting products to actively participating in the design
process [6]. Typically, with a complex design task, the com-
munication and collaboration among team members are
essential to enable the design to be carried out effectively.
This places pressure on designers for faster turnaround time,
the lower margin for error, and more efficiency in the design
process [1].
There is a primary disconnect between the wealth of digi-
tal virtual content available to us and the physical world.
This gulf between the real and digital worlds limits our abil-
ity to make use of the torrent of information and insights
[7]. However, AR/MR, a set of technologies that seamlessly
overlays virtual content on the physical world, supports more
natural and intuitive spatial manipulation of 3D virtual
objects, and improves visualization through virtual-real fusion [8]. It also has the potential to support awareness information during co-design, shifting from working alone to working together [9].

* Shusheng Zhang (zssnwpu@163.com)
* Xiaoliang Bai (bxl@nwpu.edu.cn)
1 Key Laboratory of Contemporary Designing and Integrated Manufacturing Technology, Ministry of Education, Cyber-Physical Interaction Lab, Northwestern Polytechnical University, Xi'an 710072, China
2 Empathic Computing Lab, University of South Australia, Mawson Lakes, Australia

In the coming months and years, AR/MR is
going to transform how we learn, make decisions, and inter-
act with the physical world. It will also change how enter-
prises serve customers, design products, and manage their
value chains, and, eventually, how they compete [7].
Therefore, AR/MR has the potential to significantly
improve the design process and address these design pres-
sures. In recent years, AR/MR has been used in many appli-
cation areas such as medicine [10], manufacturing (design, maintenance, repair, assembly, etc.) [11–16], and education [17, 18]. In addition, compact and light-weight head-mounted displays suitable for AR/MR are becoming readily available, such as Google Glass,¹ Microsoft's HoloLens,² the DAQRI Smart Helmet,³ zSpace,⁴ and the Meta 2.⁵ This means that AR/MR software and hardware are becoming available to support the design process.
AR/MR has previously been used as an interface for CAD
tools, enabling the designer to see 3D virtual design models
superimposed over a real environment [13, 19, 20]. These
AR/MR-based CAD systems can also be used in the co-
design process. A typical scenario (see Fig.1) of AR/MR
co-design could involve a team of designers from the same
or different disciplines, co-located or dispersed geographi-
cally, who can share the augmented design environment,
naturally communicate with each other, and work in parallel
to perfect product designs [21, 22].
Consequently, the research on AR-based co-design is a
promising and growing area in manufacturing. However,
to the best of the authors' knowledge, there have not been any
comprehensive reviews of AR/MR-based co-design in manu-
facturing. More specifically, such a review is needed to help
identify the unsolved research problems, and guide people
conducting further work in the field. This paper addresses
this need and provides a comprehensive survey of AR/MR-
based co-design between 1990 and 2017.
(Fig. 1A shows the co-located work scenario [22]; Fig. 1B shows the remote co-design scenario [23].)
The main contributions of this paper are as follows:
(a) It provides a comprehensive overview of existing works
in AR/MR-based co-design in the manufacturing tech-
nology community.
(b) The advantages and typical framework of AR/MR-based co-design in manufacturing are discussed.
(c) It provides a summary of the key features and technolo-
gies of AR/MR-based co-design.
(d) Current important areas and future trends of research
for AR/MR-based co-design are presented.
The next section presents an overview of AR/MR-based
co-design from three perspectives: (1) traditional co-design,
(2) the advantages and typical architecture of AR/MR co-
design, and (3) a comprehensive survey of existing systems.
Section3 summarizes the basic requirements of AR/MR
co-design technology, AR/MR-based co-design features, and
modeling technology. Then, Sects.4 and 5 give a summary
Fig. 1 Typical scenarios of AR/MR co-design

1 http://www.google.com/glass/start
2 https://www.microsoft.com
3 https://www.daqri.com
4 https://zspace.com
5 http://www.metavision.com
of current issues and future trends in AR/MR-based co-
design, followed by discussion and conclusions.
2 Survey ofAR/MR‑based co‑design
2.1 Traditional co‑design
Product design is a vital step which has to be considered at
the early stages of the entire product development process
[24]. A typical co-design CAD system requires two types
of capabilities: distribution and collaboration. Distribution
functions extend the capabilities of CAD systems and ena-
ble them to support co-design activities, while collabora-
tion functions integrate individual systems to accomplish
a co-design assignment. Although these two capabilities
have different priorities, they are closely interconnected
and complementary. Traditional design uses a serial method, dividing the design process into sub-tasks that are executed consecutively. This approach is, however, fragile and inflexible.
In addition, it often needs numerous iterations which makes
the design process time-consuming and expensive and limits
the space of design alternatives that can be attempted [25].
In contrast, co-design tries to address these problems concurrently.
2.1.1 Typical modern co‑design CAD system
Many emerging technologies have been proposed for implementation in co-design systems, including web-based systems, agents, and cloud computing. Research has been carried out to study how these technologies can be used to help designers perform co-design. For example, web-based [26–31], agent-based [32–36], and cloud-based [37, 38] collaborative work systems have been used to support co-design tasks.
There is also a considerable body of work [25, 26, 39–43] which examines co-design CAD systems from different perspectives. From this published literature, we can conclude that existing co-design CAD systems are typically of the following types:
1. Web-based collaboration: a web-based system which can
easily access catalog and design information, and com-
municate among cross-disciplinary distributed design
team members without any geographic restrictions, even
across time zones [25].
2. Agent-based collaboration: an agent-based collabora-
tive system which shares relevant information with
designers. An agent mechanism can simplify the system
architecture and can respond to changes when any col-
laborators change their models [25]. It can appropriately
describe the world by focusing on objects rather than
functions and handle rapidly changing conditions [36].
3. Cloud-based collaboration: a cloud-based collabora-
tive system provides dramatically more benefits than
web-based or agent-based systems. This is because it
is characterized by flexibility, high performance, ubiq-
uitous access, and also processing/analyzing big data
by an open-source programming framework, support-
ing communication and sharing information by social
media, and being capable of answering users’ queries
by an intelligent search engine, etc. [37, 44].
In addition, in [37], cloud-based, web-based, and agent-based approaches are systematically compared from various perspectives, such as architecture, data storage, and communication.
These approaches are typically applied in modern commercial CAD systems, where the 3D CAD models of the part to be designed are displayed on computer screens to assist the designer.
However, there are several practical issues that have not been
solved explicitly and satisfactorily. For example, the design
models’ spatial relationships can be vague, since the display
of the blueprint is often shown on a 2D monitor [13]. This
means that designers must decompose 3D design tasks into
2D or 1D modeling operations with a 2D interface, which
can make the design process/task lengthy and tedious [45].
Another issue is that the task space and communication space are separated in the co-design process. That is, when collaborators want to exchange innovative ideas or an improved plan, they must first look at the screen to locate the problem, and then communicate with each other [46], which hampers natural communication. Next, the collaborative systems' human–computer interaction (HCI) needs to be improved [47]. For example, when designers work, they often cannot interact directly with their hands, but must use 2D interfaces such as a monitor, keyboard, and mouse. Finally, empathy
is not considered in the most commonly used collaborative
CAD systems. In practical terms, creating empathy can
improve design efficiency by sharing emotions and helping
people better understand one another during the co-design
process [4850].
2.1.2 VR‑based co‑design CAD systems
Based on recent advances, VR and AR technologies have
the potential to free designers from the constraints imposed
on them by the traditional 2D interfaces, enabling more
intuitive and efficient interaction with 3D virtual models
[51]. VR technology plays an important role in simulating
advanced 3D HCIs, and has the potential to improve prod-
uct design due to its many advantages [45, 52], e.g., good
immersion, cost-effective strategy, and providing special
scenarios (hazardous or dangerous environments) that would
be difficult/impossible to access, etc. VR is more mature
than AR/MR, and thus, more VR solid modeling systems
have been published. Some research [45, 53–58] on VR CAD systems has been reported. Gao et al. [45] introduced a constraint-based method to create and manipulate solid models through direct 3D manipulation and voice commands. Jezernik and Hren [53] discussed a solution to integrate CAD and VR databases in design processes. Wang et al. [56] and Zhong et al. [54, 55] presented a series of works on a hierarchically structured and constraint-based data approach to intuitive and precise solid modeling. Kan et al. [57] described an Internet-based VR co-design system demonstrating the feasibility of co-design for companies focusing on low-cost products. Mahdjoub et al. [58] introduced a VR- and multiagent-based co-design-for-usability approach in a PLM environment.
Compared with these VR-based systems, an AR-based
co-design system has some advantages in product design,
as follows.
i. In an AR/MR-based co-design environment, both the real and virtual worlds can be seen while users design the product. In other words, it is a semi-immersive design environment which provides a safer and more realistic feeling. Therefore, the users' sense of realism is never lost and interaction is more intuitive and natural, thus improving the usability and credibility of the design process [59, 60].
ii. Collaborators’ empathy can be easily taken into
account in the AR/MR-based design to improve the
quality and efficiency of communication.
iii. Compared with VR where designers are entirely
immersed in a virtual environment, AR/MR enriches
the way that designers experience the design process
in the virtual-real fusion environment [61].
2.2 The advantages ofAR/MR co‑design
There are a number of advantages using AR/MR co-design
for it can combine AR/MR/VR and traditional 3D CAD sys-
tems, as shown by the recent study [62]. More precisely,
first, designers can share the augmented workspace. In an
augmented environment, multiple participants can simulta-
neously share both the physical space surroundings and the
virtual world through AR/MR displays, and users can collaborate by taking full advantage of real objects within the same shared augmented workspace as communication/interaction tools [63, 64]. Real concept sketches and
virtual 3D models can be combined in the AR/MR co-design
process. Therefore, designers can focus on creating the prod-
uct using the physical context.
Second, it is possible to share empathy to improve com-
munication efficiency. Traditional co-design systems use tools such as video or audio-only conference calls, where participants can only view collaborators' faces or workspaces, or nothing at all. However, an AR/MR-based
design environment can provide increased awareness of the
user’s context [65]. Communication cues and awareness
information of other designers may affect the co-design effi-
ciency. These include the designers’ speech and nonverbal
information, such as head direction, body languages, spatial
relationships, gestures, facial expression, and acoustic indi-
cators [66]. From these kinds of information, a designer can quickly understand a partner's intent [47, 63, 64], which can aid the understanding of designers' activities [66, 67].
Third, compared with VR where designers are entirely
immersed in a virtual environment, AR/MR enriches the
way that designers experience the real world by embedding
interactive virtual objects within the physical world [61].
Moreover, an AR/MR-based setting is easy and economical
to construct, compared to VR-based systems which need
to model the entire environment [16, 68]. AR/MR can partially use a VR environment rather than a fully immersive VR with its high computational cost.
Finally, AR/MR provides more natural and intuitive
interactivity and 3D model viewing than traditional
CAD systems. Designers can see 3D virtual objects floating
in space in front of them which they can interact with using
natural gesture input. Taking into consideration these AR/
MR merits and the current problems of design in manufac-
turing, AR/MR could be one of the most promising tools for
mechanical co-design.
2.3 A typical architecture ofAR/MR‑based
co‑design
As mentioned above, AR/MR technology has many advan-
tages. Therefore, AR/MR could be one of the most prom-
ising technologies to use in the collaborative mechanical
co-design process [69]. Considering the functions of the
systems and the roles of the collaborators participating in
a co-design activity, typical AR/MR-based co-design sys-
tems can be roughly grouped into two types [13, 26, 39]:
(1) visualization-based co-design and (2) simultaneous co-
design. Visualization-based co-design provides an AR/MR-
based environment in which the product virtual models can
be visualized, annotated, and inspected in a light-weight way
by participants assisting with co-design activities. Simulta-
neous co-design provides an AR/MR-based environment in
which the designers are capable of collaborative interaction and simultaneous co-modeling.
The two types of systems are usually developed using a
client/server framework [13, 70]. In the first type, the main
function of the client is supporting the 3D model visualiza-
tion and manipulation by an intuitive interface, while the
server mainly supports CAD modeling kernels and informa-
tion exchange. In the second type, the main function of the
client is supporting CAD modeling kernels, allowing users
to create or modify and manipulate a model in the clients’
workspaces, and the server mainly supports information to
be kept consistent in the co-design process. The commu-
nication between server and clients is based on the stand-
ard TCP/IP socket communication libraries. Moreover, it is
necessary to synchronize the manipulation of the 3D design
model and avoid confusion in the modification/manipulation
of the virtual model. The central server has three modules:
(1) the client connection module, (2) the modeling module,
and (3) the central database module. For more details of
the mechanism and module functions, see [13, 71, 72]. The server always listens for incoming connections and builds a communication socket. To handle multiple clients, it creates a new pair of threads for each client that successfully connects to the co-design session. A server thread is created for each client to handle all of the client/server communication during the co-design process. The client thread receives all updated design information from other clients and sends its own updates to the other clients via the server. The socket connection between the client and the server is broken when the client quits.
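The client/server relay pattern described above can be sketched in a few lines of Python: an accept loop runs in its own thread, one server thread is spawned per connected client, and each received update is forwarded to every other client. The class name and raw-bytes message format are illustrative, not taken from any of the surveyed systems; a real system would frame messages and manage editing rights on top of this topology.

```python
import socket
import threading

class CoDesignServer:
    """Minimal relay server: every design update received from one
    client is forwarded to all other connected clients (a sketch of
    the client/server pattern described in the text)."""

    def __init__(self, host="127.0.0.1", port=0):
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        self.sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        self.sock.bind((host, port))          # port 0: pick any free port
        self.sock.listen()
        self.port = self.sock.getsockname()[1]
        self.clients = []                     # open client connections
        self.lock = threading.Lock()          # guards self.clients

    def start(self):
        # the 'always listening' accept loop runs in its own thread
        threading.Thread(target=self._accept_loop, daemon=True).start()

    def _accept_loop(self):
        while True:
            conn, _addr = self.sock.accept()
            with self.lock:
                self.clients.append(conn)
            # one server thread per connected client
            threading.Thread(target=self._serve, args=(conn,), daemon=True).start()

    def _serve(self, conn):
        try:
            while True:
                update = conn.recv(4096)      # a serialized design update
                if not update:
                    break                     # client quit: socket closed
                with self.lock:               # relay to every other client
                    for other in self.clients:
                        if other is not conn:
                            other.sendall(update)
        finally:
            with self.lock:
                self.clients.remove(conn)
            conn.close()
```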
According to the features of the design tasks, these two architectures can be adopted at different stages of co-design. At the early stages, there are relatively substantial changes to decision-making, the overall blueprint, the 3D virtual model structure, the overall functional structure, and even partial structures. To provide a flexible and personalized platform, simultaneous co-design may be a good choice, letting designers in different roles take advantage of their talents and focus on the design task itself. At the later design stages, visualization-based co-design can improve efficiency while avoiding conflicts and optimizing data transmission. Therefore, we think that the ideal AR/MR co-design platform should integrate these two characteristics and support different capabilities at the different co-design stages.
In addition, a typical AR/MR co-design system should support various flexible interactions, such as the creation and sharing of AR annotations, an 'editing right' mechanism, compatibility with traditional CAD systems, sketch-based schemes, co-design revision history, natural and intuitive inputs, and the avoidance of conflicts in rendering, cloning, branching, and merging paths, etc. Moreover, researchers can draw many implications from AR/MR design systems such as Spacetime [73], MobiSweep [74], Co-3Deator [62], Window-Shaping [75], TMotion [76], and skWiki [77], and even from the experience of TortoiseSVN.⁶ More specifically, Xia et al. [73] explored three novel interaction concepts (i.e., containers, parallel objects, and avatar objects) for multi-user collaborative editing in a 3D VR environment. Piya et al. [62] presented a sketch-based collaborative 3D modeling platform based on the concept that "all design is redesign". More importantly and interestingly, Zhao et al. [77] developed a web-based collaborative creativity architecture, skWiki, using the concept of paths as trajectories of persistent state at different co-design stages. In particular, the skWiki framework has intrinsic support for collaborative editing, including the cloning, branching, and merging of paths edited by multiple contributors.
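The "paths as trajectories of persistent state" idea attributed to skWiki [77] can be illustrated with a toy data structure. The class names and API below are our own invention and greatly simplify the real system (merging is omitted): each named path points at its latest revision, a commit appends a new persistent state, and branching clones the tip of an existing path so that subsequent edits diverge.

```python
from dataclasses import dataclass

@dataclass
class Revision:
    state: str                         # serialized design state (illustrative)
    parent: "Revision | None" = None   # previous revision on this path

class PathHistory:
    """Each named path is a trajectory of persistent states; branching
    clones the tip of an existing path, and divergent edits then build
    separate revision chains."""

    def __init__(self, initial_state):
        self.paths = {"main": Revision(initial_state)}

    def commit(self, path, state):
        # append a new persistent state to the path's trajectory
        self.paths[path] = Revision(state, parent=self.paths[path])

    def branch(self, src, dst):
        # cloning: the new path starts from the source path's current tip
        self.paths[dst] = self.paths[src]

    def history(self, path):
        # walk parents back to the initial state, oldest first
        rev, states = self.paths[path], []
        while rev is not None:
            states.append(rev.state)
            rev = rev.parent
        return list(reversed(states))
```

Because revisions are immutable and shared up to the branch point, two contributors can extend "main" and a branched path independently without copying the common history.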
2.4 The comprehensive review
2.4.1 Methodology
Our survey method is characterized by iterative filtering and classification processes, shown in Fig. 2. First, we applied the different filtering steps to the initial list of publications from Web of Science using the keywords "augmented reality (AR)", "mixed reality (MR)", "design", "collaborative design (co-design)", "collaboration", and "collaborative", and then refined the results by Web of Science categories such as computer science and the manufacturing field. This resulted in a collection of 249 publications (see Fig. 3).
Second, to extend the survey, articles were retrieved from the following online databases (i.e., Elsevier Science Direct, Taylor and Francis, IEEE Xplore, Springer Link, ACM Digital Library, etc.) using the same keywords as in the first preselection. In order not to miss the most relevant papers, we also searched for papers in AR technical journals and conferences such as ISMAR, SIGGRAPH, and VR. A total of 187 unique papers were found to meet these criteria (see Fig. 3).
Third, we removed the articles that, although containing the term "Augmented Reality (AR)/Mixed Reality (MR) design/co-design", are not actual AR/MR-based co-design research papers. Such "false positives" may merely mention AR co-design in the keywords or future work section, or contain some AR/MR-based co-design literature references.
Fourth, we extended our survey to publications that were not captured by the keyword selection by searching the references of those papers, to get a better estimate of the amount of AR/MR-based co-design work in manufacturing.
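The filtering steps above can be sketched as a small pipeline. The record fields and helper names below are hypothetical, for illustration only; the real screening was done manually against the databases named in the text.

```python
# Keywords from the search strategy above; record fields and helper
# names are hypothetical, for illustration only.
KEYWORDS = ("augmented reality", "mixed reality", "co-design",
            "collaborative design", "collaboration")

def keyword_match(record):
    """First pass: keep records whose title/abstract mention a keyword."""
    text = (record["title"] + " " + record["abstract"]).lower()
    return any(k in text for k in KEYWORDS)

def is_false_positive(record):
    """Third pass: drop papers that only mention AR co-design in the
    keywords or future-work section (here reduced to a boolean flag)."""
    return record.get("mentions_only", False)

def filter_papers(records):
    return [r for r in records if keyword_match(r) and not is_false_positive(r)]
```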
Finally, we considered how much impact each paper had, by measuring its Average Citation Count (ACC) using the following formula [78]:

    ACC = total lifetime citations / lifetime (in years),

where the total lifetime citation count is obtained from Google Scholar.

6 https://tortoisesvn.net

Based on this formula, we included all papers that had an ACC of at least 0.5, showing that they had at least a moderate impact in the field. This resulted in a final set of 67 papers that we reviewed in detail (Fig. 5). To review these 67 papers, we focused on the following attributes: (1) key topics [e.g., HCI, interaction techniques, design review (DR),
empathic computing, etc.], (2) types of experiment (e.g., pilot, formal, heuristic, etc.), (3) system types (e.g., visualization-based, simultaneous, co-located, and remote AR-based co-design), (4) key features (the main contributions or highlights, etc.), (5) types of tracking (e.g., markerless, marker-based, sensor-based, etc.), and (6) displays used (e.g., desktop, handheld, and HMD), plus ACC. Types of tracking and displays used are discussed in detail in Sect. 2.4.2.
Each paper on AR-based co-design published from 1990 to 2017 was reviewed to determine its eligibility and level of relevance. The articles must be AR work in the area of co-design; some articles in manufacturing that concentrate on AR collaboration are also included.
2.4.2 Major ndings
In this survey, we provide a high-level analysis of the data.
Overall, in the 436 (249 in Web of Science and 187 in 8
Fig. 2 Methodology flowchart
Fig. 3 Result of the initial search from Web of Science and publisher databases
Table 1 Major AR/MR co-design research

Teams/systems | Year | System types (Simultaneous / Visualization-based / Remote / Co-located) | Contributions/highlights | ACC
Ahlers [79] 1995 Construct co-design applications by
distributed AR
Focus on distribution and interac-
tion aspects of the system
4.82
Transvision
[47]
1996 √ √ Propose an AR-based co-design
architecture
Consider mutual awareness
7.38
VLEGO [63, 80];
SeamlessDesign [81]
1996–1999 √ √ Propose SeamlessDesign and
VLEGO system
Confirm the effectiveness of AR-
based co-design more than VR
and traditional co-design
3.19
1.74
1.89
Hagen etal.
[82]
1998 √ √ Present an interactive AR-based
co-design system and demonstrate
its functionalities
0.89
Studierstube
[51, 83]
1998–2002 √ √ Propose a client -server architecture
Support 3D operations
18.26
25.53
Grønbæk etal. [84]. 2001 √ √ Propose an integrated set of interac-
tive room technologies supporting
co-design
1.81
Mark's team [48, 60, 65, 85–88] 2002–2017 √ √ Propose the conception of empathy
glasses and empathic computing
Conduct the first experiment on
a mixed-space collaboration
between AR and VR environment
First using gaze tracking for remote
collaboration in head-mounted
systems
Study empathic computing in the
AR collaborative work
3.00
3.00
6.13
6.50
8.00
5.00
Regenbrecht etal. [89, 90]. 2002 Present a general concept for a col-
laborative tangible AR system
7.80
5.07
Klinker [91] Develop a proof-of-concept AR system for car designers
5.33
Liu etal. [92]. Propose a binary square marker
recognition technique
0.80
Verlinden etal. [93]. 2003 Propose a WARP (Workbench for
Augmented Rapid Prototyping)
system focusing on the combina-
tion of RP and TUI
5.29
Hirakawa [94] 2004 Propose a transparent display in AR
system
1.46
Penn [95] Support architectural and urban
design
1.46
ARTHUR [96] Propose the ARTHUR system, an
AR-enhanced round table to sup-
port complex design and planning
decisions
8.00
Ong’s team
[13, 71, 72, 97, 98]
2004–2010 √ √ Propose the AR-based co-design
architecture, and the mechanism
of keeping model consistency
Propose a virtual interaction panel
(VirIP) to control AR systems
9.86
0.82
4.56
2.23
1.00
Table 1 (continued)

Teams/systems | Year | System types (Simultaneous / Visualization-based / Remote / Co-located) | Contributions/highlights | ACC
Maher and Kim [99, 100] 2005 Study on the impact of TUI and
GUI on cognition
1.25
2.50
Park and Lee [101] 2006 Propose augmented foam of AR
techniques to physical blue foams
1.27
Sidharta etal. [102]√ √ Propose an alternative to 2D desk-
top-based interfaces(augmented
tangible interface for DR
2.27
Sakong and Nam [103] √ √ Study on the impact of visual and
physical cues in distributed 3D
AR-based co-design
1.64
Wang etal. [2, 104107] ;
TeleAR [23]
2006–2011 √ √ Present a framework for an intel-
ligent agent-based AR system;
Explore an innovative co-design
space through MR boundaries;
propose a new TeleAR system to
enhance the distributed cognition
among remote designers
4.00
1.88
4.09
0.50
11.00
Seichter [108] 2007 Investigate the impact of a direct
manipulating TUI and a pen-like
3D manipulation interface on the
design process
2.40
IMPROVE [109, 110] Collaborative DR 2.60
1.00
Chastine etal. [111] 2008 Study on the Effectiveness of Vir-
tual Pointers in Collaborative AR
2.22
Lee etal. [112]. 2009 Propose a new AR/RP-based
approach to co-design evaluation
2.25
Hammad [113] Propose a distributed AR methodol-
ogy for visualizing collaborative
construction tasks
5.00
Gu etal. [3]. 2010 √ √ Study on the effect of 3D virtual
worlds and TUI on architectural
design
7.50
Caruso and Re [114] Describe an AR-based DR system
that allows overcoming some
issues related to the visualization
and to the interaction with the
virtual objects
1.57
IMMIView [115] Propose a multi-user solution for
co-design review in real-time
0.86
Januszka and Moczulski [22, 116–119]
2006–2011 Present an AR-based co-design for
aiding product design and devel-
opment of mechanical systems;
application of an integrated CAD
and AR system
1.83
1.14
1.00
0.55
1.50
Thomas [120] 2011 Present SAR Technology for help-
ing co-design of complex physical
environments
1.50
Poppe [121, 122] 2011–2012 Evaluate an AR-based co-design
modeling system
1.50
1.60
Ko and Chang
[123]
2012 Investigate and evaluate interaction
with an AR-based co-system in
interdisciplinary design teams
0.67
Table 1 (continued)

Teams/systems | Year | System types (Simultaneous / Visualization-based / Remote / Co-located) | Contributions/highlights | ACC
Gauglitz etal. [124, 125]. 2012–2014 √ √ Propose a real-time computer
vision-based tracking and mod-
eling of the environment to the
remote collaborative work
16.33
11.20
BeThere [126] 2014 √ √ Explore 3D input for mobile col-
laborative interactions
Demonstrate the feasibility of the
proposed system by a user study
15.00
TeleAdvisor [127, 128] 2012–2015 √ √ Propose a versatile projection-based
AR system for remote collabora-
tion
1.50
11.40
DualCAD [20] 2016 Integrate AR with a desktop GUI
and smartphone interaction
3.00
Leman etal.
[129, 130]
2016–2017 Study the co-design cognition and
interaction behavior of the design-
ers in the AR-based co-design
1.00
Jeronimo [131] 2017 Propose 3D user interfaces for VR/
AR-based co-design
Table 2 Key topics of AR/MR-based co-design publications

Key topics | 1994–1997 | 1998–2001 | 2002–2005 | 2006–2009 | 2010–2013 | 2014–2017 | Total
Evaluation usability/
effectiveness
[47, 80] [63] [97] [71, 72, 98, 112, 113]
[106, 108, 111]
[104]
[2, 13, 22, 116,
120] [114, 115,
121–123]
[23, 65, 88, 124, 125]
[20]
29
Design review (DR) [96] [106, 109, 110] [114, 115] – 6
Interaction techniques
Multimodal interaction
– – [71] [115] [129] 3
TUI – – [89, 90, 93, 95,
96, 99, 100]
[101–103, 106, 108]
[112]
[3] – 14
Others [47, 80] [51, 81, 84] [83, 94, 95, 97] [72, 98, 107, 109, 110,
117119]
[13] [23, 126, 127] [129,
130] [131]
24
HCI: Experiment type (Pilot, Formal), and UX factors
Pilot – – [94, 99, 100] [103] – [48, 85, 88] [20] [129,
130]
10
Formal – [63] [60, 132] [104106, 108, 111] [2, 3, 22, 116, 123] [124, 125] 15
UX factors [63] [60]
[99, 100]
[105] [106, 111] [2, 22, 116] [3] [88] [48] [85] [129,
130].
16
Empathic computing [23, 48, 8587] 5
Rendering
OpenGL – – [60, 92, 132] [71, 72, 98] [13, 121, 122] – 9
OpenSceneGraph
(OSG)
– – [109, 110] [115] [23, 65] 5
VRML – [82] [92, 95, 96] [117119] [22, 116] – 9
CAD softwares [83] [113] [72, 98]
[117119]
[22, 116]
[13]
– 10
online databases) papers, there were 67 studies reviewed (see Table 1).
Table 1 summarizes the major worldwide research using AR/MR in mechanical co-design. In this paper, we aim to provide a brief survey of the characteristic features of AR/MR-based co-design. We provide a concise review of the key topics (see Table 2), system types, key features, tracking types, displays used, and ACC with reference to seminal publications.
The system types are visualization-based AR/MR co-design and simultaneous AR/MR-based co-design. Each can be further divided into co-located and remote co-design, according to whether or not the collaborators are in the same place. From Table 1, we can see that the dominant system type is the visualization-based one (41 articles, 61.19%), of which co-located and remote systems account for 73.17% and 26.83%, respectively. Simultaneous systems account for 26.87%, and only five systems support co-located and remote work at the same time.
The graph (Fig.4) shows the trend of the number of
papers on AR-based co-design. The bar chart illustrates
that few studies are published before 2000 and there are the
most publications between 2006 and 2009. However, more
researchers are increasingly concerned about this field with
the development of AR technology, because the number of
papers keeps a relatively stable level per year from 2002 to
2017.
To ensure the quality of the reviewed papers, we considered their impact via ACC. It must be pointed out that although three papers have an ACC of zero, we did not exclude them because they were all published in 2017. Consequently, the ACC of the remaining 64 papers was statistically analyzed. As Fig. 5 shows, more than half (36, 56.25%) of the reviewed papers have an ACC of at least 2.
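The ACC binning used in Fig. 5 can be sketched as follows. This is a hypothetical helper, not code from the survey: it assumes ACC denotes a paper's total citations divided by the years since publication, and the reference year 2018 and the example citation counts are illustrative assumptions.

```python
# Hypothetical helper (assumption: ACC = citations averaged over years
# since publication); reuses the half-open ACC bins shown in Fig. 5.
def acc(citations, pub_year, ref_year=2018):
    """Average citations per year; ref_year is an assumed survey cutoff."""
    return citations / (ref_year - pub_year)

def acc_bin(value, edges=(0.5, 1, 2, 4, 6, 8, 26)):
    """Return the [lo, hi) bin from Fig. 5 that an ACC value falls into."""
    for lo, hi in zip(edges, edges[1:]):
        if lo <= value < hi:
            return (lo, hi)
    return None  # below 0.5 or at/above 26

# Example: 9 citations to a (hypothetical) 2012 paper give ACC 1.5,
# which falls into the [1, 2) bin.
assert acc_bin(acc(citations=9, pub_year=2012)) == (1, 2)
```

Papers with ACC of at least 2 then fall into one of the four highest bins, matching the 36-paper count reported above.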
Fig. 4 Publications on AR/MR-based co-design (article counts per period, 1994–1997 to 2014–2017: 3, 5, 13, 19, 13, 14)
Fig. 5 ACC of the surveyed publications (article counts per ACC bin [0.5,1), [1,2), [2,4), [4,6), [6,8), [8,26): 7, 21, 11, 9, 5, 11)
Fig. 6 The proportion of displays used (projector-based 13%, desktop 23%, handhelds 17%, HMD 47%; of the HMDs, video see-through 70% and optical see-through 30%)
Fig. 7 The trend of displays used (desktop, handhelds, HMD, projector-based; 1994–2017)
Then, we recorded the displays used in AR/MR-based co-design. The pie chart (Fig. 6) depicts the proportion of displays used in the studies. They can be classified into four types: projector-based, desktop, handheld, and HMD. Overall, the HMD has the largest proportion, accounting for 47% (video and optical see-through HMDs account for 33% and 14%, respectively). The percentage increases
slowly until the 2006–2009 period, and drops steadily after that (Fig. 7). The desktop makes up 23%, followed by handhelds with 17% and projector-based displays with 13% (Fig. 6). In general, desktop, handheld, and projector-based displays are increasingly employed (Fig. 7). Thirty-three papers used two displays and nine papers used three. It should be noted that the desktop share is nearly half that of the HMD, and that the video see-through HMD is used more than twice as often as the optical see-through HMD. It is also interesting to note that the share of projector-based displays is almost the same as that of the optical see-through HMD. As Fig. 7 shows, between 1994 and 2001 (eight papers in our review), only one paper used a projector-based display and two papers used handhelds. Since 2002, however, the number of papers using handhelds has increased. This shows that mobile AR (MAR) is becoming popular, at least in terms of publications on AR/MR-based co-design.
Real-time tracking is one of the crucial tasks in AR/MR-based co-design. The trend in the implementation of tracking methods in AR/MR-based co-design is shown in Figs. 8 and 9. Marker-based methods such as ARToolKit are the most popular in AR/MR-based co-design systems. This is reasonable, possibly because the marker-based method is one of the fundamental technologies that makes AR/MR-based co-design simpler and faster. In addition, it provides a reasonably robust and stable solution for a prepared design workspace. It should be noted that the marker-based method is used more than 2.5, 4, and 5 times as often as the marker-less, sensor-based, and other methods, respectively (Fig. 8). It is also interesting to note that the trend of the marker-based method is very similar to that of HMD use (Figs. 7 and 9). From Fig. 9, we can see clearly that use of the marker-less method slightly increased over time, while use of the sensor-based method gradually decreased on average. "Others" denotes papers that do not specify the tracking method used.
With the development of AR/MR-based co-design research, many related research directions have been investigated and discussed extensively. Table 2 shows seminal papers published in these categories over time (e.g., usability/effectiveness evaluation, empathic computing, HCI, DR, etc.). Of course, some papers discuss several topics and are not limited to one category.
We find many papers on HCI (25 articles, 37.31% of the total) exploring, from different perspectives of co-design, how AR/MR technology changes designers' behaviors, cognitive rules, and preferences. The main experiment types in HCI are the pilot test and the formal user study; over the years, there has been an obvious increase in formal user studies but few pilot studies. For UX factors (e.g., preference, ease of use, perceived performance, intuitiveness and naturalness, user attitude, etc.), the most commonly used experimental task is filling out questionnaires. However, three studies [3, 129, 130] adapted the protocol analysis method to study the perception, action, and cognition of designers during AR/MR-based co-design. This suggests opportunities for increased use of field studies and a wider range of evaluation methods. In addition, the largest number of papers concerns usability/effectiveness evaluation (29 articles, 43.28%), of which 11 articles (37.93% of the 29 in this topic) evaluated the system via a case study. HCI accounts for 44.83% of the usability/effectiveness evaluations. There is a close relationship between usability/effectiveness evaluation and HCI, although they have different focuses in some cases.
Besides, the category of interaction techniques can be
divided into three sub-categories, namely, multimodal inter-
action (3 articles, 4.47% in total), TUI (14 articles, 20.90%),
and other interaction techniques (24 articles, 35.82%).
Although multimodal interaction has a long history, it is under-rated: there were few publications on this topic in the past; however, it has gradually attracted greater attention recently.
Fig. 8 The proportion of tracking types (marker-based 56%, marker-less 21%, sensor-based 13%, others 10%)
Fig. 9 The trend of tracking types (sensor-based, marker-based, marker-less, others; 1994–2017)
number of publications per year from 2002 to 2011. Most
of the articles that fall into other interaction techniques are
related to the topics of interaction by gestures, the virtual
menu, 3D input UI and 3D input devices, etc.
Furthermore, collaborative design review (DR) is another point worth noting. It is one of the most common applications of co-design in industry [109]. We find six articles on AR/MR collaborative DR, three of which concern AR/MR-based architectural collaborative DR that we think researchers can draw on. It can be seen from Table 2 that empathic computing is a recent, new field in AR-based co-design. Although empathic design has been studied for several decades, few papers have systematically explored empathic computing in AR/MR-based co-design. The five most relevant articles we found (7.45% of the total) suggest that this may be a promising new research direction.
Another important point is the rendering of the 3D model. There are 24 articles (35.82% of the total) that clearly point out the rendering library used. The common ones are OpenGL (9 articles, 37.50% of the 24), OSG (5, 20.83%), VRML (9, 37.50%), and CAD software (9 articles, 37.50%) such as the SolidWorks API, CATIA, and Open Inventor. Some studies use two libraries; for example, refs. [13, 72, 98] used OpenGL and the SolidWorks API, and refs. [22, 116–119] used VRML and CATIA. Interestingly, almost all of the CAD-software-rendering methods are associated with the desktop display (see Tables 1 and 2).
Finally, refs. [93, 101, 102, 112] focus on combining Rapid Prototyping and AR/MR to design products, which is possibly a new research direction. It must be pointed out that we collected nine papers [48, 65, 85, 88, 124–128] (13.43% of the total; the ACC of four articles is at least 11.00) focusing on AR-based remote collaborative work, for two reasons. On the one hand, they are the latest achievements in AR/MR-based collaborative work; on the other hand, their mean ACC is 8.41 (standard error 1.79), which shows that they have had a considerable impact in the field. We anticipate that future work on AR/MR-based co-design can draw lessons from them.
3 AR/MR co‑design key features
andtechnologies
3.1 Basic requirements ofAR/MR technology
This section reviews the basic requirements of AR/MR tech-
nology for co-design. We focus on three main factors here;
registration and tracking, augmentation, and rendering.
3.1.1 Registration andtracking
Registration and tracking are crucial tasks in AR/MR appli-
cations. Without accurate tracking and registration, it is
impossible to seamlessly merge the virtual and real objects.
Certainly, it is also very important in the AR/MR-based co-
design to keep real-time tracking suited for design scenarios.
In an AR/MR-based co-design system, the design model will appear to float when the registration is not accurate enough [133]. Hence, accurate registration is important, requiring considerably accurate position and orientation tracking to register the virtual information [134]. In addition, accurate registration provides each designer with a perspective-correct view and ensures that, when designers point to the design model, they refer to the same place in the shared workspace [135].
Another crucial requirement of AR/MR-based co-design is accurately tracking where designers are located relative to the workspace as they move their bodies, heads, and eyes. In general, AR trackers used in co-design systems need to provide high accuracy, low latency and jitter, and robust operation under various design environments [16]. Currently, many commercial trackers applicable to AR-based co-design are available, e.g., marker-based, mechanical, magnetic, sensor-based, marker-less, and hybrid systems that combine the advantages of two or more approaches [134]. Shen et al. [13, 14, 72] reported that registration and tracking are accomplished by the marker-based methods in ARToolKit, which are fast to execute and easy to implement; however, they are not robust to occlusion and can render the design model unstably (e.g., jitter). Compared to ARToolKit, the Vuforia AR tracking library enables extended and robust tracking [129]. For more details about AR/MR tracking, please refer to the more comprehensive surveys [11, 16, 61, 136, 137].
From the graphs (Figs.8 and 9), the data showed that
although the marker-based tracking is still popular, the ten-
dency of markless tracking is on the rise and the marker-
based is in decline. We think that there is a relationship
between the tendency and lots of AR/MR SDK/platforms,
which can facilitate AR/MR-based co-design, such as,
ARTag, Unity3D, Unreal, Wikitude, ArUco, ARKit, and
ARCore. More specifically, the marker-less tracking gets
more and more attention; for instance, ARKit, ARCore, and
simultaneous localization and mapping (SLAM).
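To make the registration step concrete, the following minimal sketch shows an assumed pipeline (not code from any surveyed system): once a tracker such as ARToolKit or ArUco reports a marker pose, virtual model vertices are transformed into the camera frame and projected with a pinhole camera model for overlay. The intrinsics and pose values here are invented for the example.

```python
# Illustrative registration sketch (assumed pipeline, not from any
# specific surveyed system): overlay virtual geometry given a tracked
# marker pose and camera intrinsics.
import numpy as np

def project_model(vertices, T_marker_to_cam, K):
    """vertices: (N, 3) model points in marker coordinates.
    T_marker_to_cam: 4x4 pose reported by the tracker.
    K: 3x3 camera intrinsics. Returns (N, 2) pixel coordinates."""
    v = np.hstack([vertices, np.ones((len(vertices), 1))])  # homogeneous
    cam = (T_marker_to_cam @ v.T).T[:, :3]                  # into camera frame
    px = (K @ cam.T).T
    return px[:, :2] / px[:, 2:3]                           # perspective divide

# Example: a marker 0.5 m in front of the camera, identity rotation.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
T = np.eye(4)
T[2, 3] = 0.5
pts = project_model(np.array([[0.0, 0.0, 0.0]]), T, K)
# the marker origin projects to the principal point (320, 240)
```

Registration accuracy then amounts to how closely this projected overlay stays aligned with the physical marker as the pose estimate updates.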
3.1.2 Display technology
Four choices are available for the display technology in an AR/MR-based co-design system: video see-through HMDs, optical see-through HMDs, projector-based displays, and mobile devices [15].
7 https://developer.vuforia.com
A video see-through HMD is the closest to VR, except that the virtual environment is replaced by a video feed of reality, with AR content overlaid upon it. Optical see-through HMDs use optical combiners, so that designers can see the physical world and the virtual world simultaneously. The third option is to project the AR overlay onto the real objects themselves, resulting in projective displays. The last is to augment the co-design environment using smartphones and tablets. However, currently available video and optical see-through HMDs are often relatively heavy and cumbersome, and may not be suitable for long-term use in the co-design process, where they may cause discomfort, eye strain, and fatigue.
The graph (Fig.4) illustrates the tendency of publications
over time. At the same time, the types of displays used in
AR/MR co-design also become various, and always keep a
relative stable. Moreover, although the HMD-based display
takes up the largest proportion (see Fig.6), it keeps a steady
trend after 2010 (see Fig.7). Certainly, the different types
of displays suit different tasks of AR/MR co-design. We
think that the HMD-based interfaces would have tremen-
dous potential with an increasing number of light-weight and
comfortable VR/AR/MR devices available (e.g., HTC Vive
Pro, HoloLens2, DAQRI Smart Helmet, and Meta2). Fur-
thermore, the mobile devices (smartphone and tablet) also
have great potential in AR/MR mobile co-design platform.
3.1.3 Rendering
For an AR/MR-based co-design system, rendering should be fast enough that dynamic error and lag are negligible, and the design model should be rendered with high photo-realism [133]. From Table 2, the virtual 3D design model can be rendered with OpenGL, VRML, or OpenSceneGraph (OSG) in the clients' workspaces, and the design model data are created from an STL model via the extraction function of a CAD tool [13, 72], such as SolidWorks or Opencascade (see Table 2). Certainly, in the visualization-based co-design system, the 3D design model can also be rendered with these open-source libraries. Importantly, the related rendering technologies have, to some extent, laid a solid foundation for AR/MR co-design in industry, with more and more enterprises, such as PTC, Autodesk, Microsoft (HoloLens), and DAQRI, paying attention to this field.
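As noted above, the model data are often extracted from an STL export before being handed to a renderer such as OpenGL or OSG. The following is a minimal sketch of that extraction for the ASCII STL dialect (a hypothetical helper; real systems typically go through a CAD API or use binary STL):

```python
# Minimal sketch of extracting triangle data from an ASCII STL export
# (hypothetical helper, not from the surveyed systems).
def parse_ascii_stl(text):
    """Return a list of facets, each a (normal, [v0, v1, v2]) tuple."""
    tris, normal, verts = [], None, []
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "facet" and parts[1] == "normal":
            normal = tuple(float(x) for x in parts[2:5])
        elif parts[0] == "vertex":
            verts.append(tuple(float(x) for x in parts[1:4]))
        elif parts[0] == "endfacet":
            tris.append((normal, verts))
            normal, verts = None, []
    return tris

stl = """solid tri
facet normal 0 0 1
 outer loop
  vertex 0 0 0
  vertex 1 0 0
  vertex 0 1 0
 endloop
endfacet
endsolid tri"""
triangles = parse_ascii_stl(stl)  # one facet with three vertices
```

The resulting triangle list is exactly the kind of vertex data a renderer would upload as a mesh buffer.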
3.1.4 AR/MR‑based co‑design features
With the help of computer technology, network technology,
information technology (IT) and VR/AR/MR technology,
AR/MR-based co-design inherits the basic idea of traditional
co-design and concurrent design, and other advanced design
methods. Considering these technical characteristics, AR/
MR-based co-design has the features of the same aim, semi-
immersion, collaboration, creativity, and interactivity.
3.1.5 Same aim
During the AR/MR-based co-design process, designers have
the same design aim, even if they undertake different design
tasks and are co-located or dispersed geographically, or even
come from different disciplines. The co-design contexts are kept consistent by sharing the design information and workspace.
3.1.6 Semi‑immersion
An AR/MR workspace provides a semi-immersive design
environment which contributes to inspiring the designers
and awakening innovation. The designers can simultane-
ously see both physical space surroundings and the virtual
world through the AR/MR displays, and thus, they can col-
laborate taking full advantage of the real objects surrounding
them as communication/interaction tools [13]. There is no
doubt that the AR/MR semi-immersive co-design environ-
ment provides a perfect platform for users to be aware of
their collaborators, which can aid understanding of other
designers’ activities and improve co-design efficiency and
quality.
3.1.7 Collaboration
To keep the AR/MR-based co-design process consistent with the ultimate design goal, there must be close collaboration and reasonable handling of the constraints of the design sub-tasks/clients. In addition, co-design is a process not only of participation in groups but also of negotiating the resolution of design issues through dialog. Moreover, different stakeholders have different ideas, objectives, requirements, and priorities, which makes conflicts in the co-design process unavoidable; thus, collaboration is the basic feature of AR/MR-based co-design.
3.1.8 Creativity andinteractivity
The AR/MR-based co-design system can provide a perfect
platform to create and show the blueprint of virtual con-
ceptual products in a natural and intuitive manner. That is,
the designer can feel immersed in the virtual-real fusion environment while designing products, which lets designers consider spatial restrictions in real time and innovate virtual concepts on the basis of the design context. Designers will have a better understanding of the scale of the product
8 https://www.opengl.org
9 http://trac.openscenegraph.org/projects/osg
10 https://www.opencascade.com
models by comparing their sizes with surrounding real objects. This is more consistent with the way we perceive the scale of objects than VR and traditional desktop CAD systems that rely on numerical dimensions. In addition, with the help of AR/MR technology, information about the designer's surrounding real world becomes interactive and manipulable.
3.2 AR‑based co‑design modeling technology
Currently, some research works have been reported in the
area of solid modeling in the AR-based environment [13].
In simultaneous co-design, the modeling process can be
roughly grouped into (1) model creation, (2) feature modifi-
cation, and (3) design model synchronization.
It is vital to avoid confusion when the CAD model is modified in the clients' workspaces. An "editing right" mechanism can be adopted to schedule the co-design activities: during the design process, only one designer holds ownership and can edit the design model. At the same time, manipulations such as rotation, zoom, and translation in one client's view do not collide with the other clients' manipulations. Although each client has its own view, the virtual design product models are kept consistent across all clients'/designers' views. More details about co-design modeling, such as feature creation and location, feature modification, model synchronization, and the model representation mechanism, are given in refs. [13, 14, 72, 98].
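The editing-right mechanism and the separation of shared geometry from per-client views can be sketched as follows. This is an illustrative Python sketch; the class and method names are our own, not from the surveyed systems, and a real system would broadcast feature changes over the network.

```python
# Sketch of an 'editing right' (single-owner lock) mechanism: one client
# edits shared geometry at a time, while per-client view transforms
# (rotate/zoom/pan) stay local and never conflict.
class SharedDesignModel:
    def __init__(self):
        self.owner = None    # client currently holding the editing right
        self.features = {}   # shared geometry, kept consistent across clients
        self.views = {}      # per-client view state, never synchronized

    def request_edit(self, client):
        if self.owner is None:
            self.owner = client
        return self.owner == client

    def release_edit(self, client):
        if self.owner == client:
            self.owner = None

    def modify_feature(self, client, name, value):
        if self.owner != client:
            raise PermissionError(f"{client} does not hold the editing right")
        self.features[name] = value  # would be broadcast to all clients

    def set_view(self, client, transform):
        self.views[client] = transform  # local-only manipulation

m = SharedDesignModel()
assert m.request_edit("alice")     # alice acquires the editing right
assert not m.request_edit("bob")   # bob must wait for the right
m.modify_feature("alice", "hole_diameter", 8.0)
m.set_view("bob", "zoom=2x")       # view changes never need the lock
m.release_edit("alice")
```

The key design choice mirrored here is that view manipulation bypasses the lock entirely, so concurrent viewing never blocks the single active editor.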
4 Current research issues
Based on the comprehensive analysis of the current cutting
edge AR-based co-design technology developments, there
are a number of current research directions that could be
explored.
4.1 Evaluation ofeectiveness andusability
The evaluation studies of AR/MR applications can be
roughly classified into two types, effectiveness and usabil-
ity, and there is a relationship between these two—the more
usable a system is, the more effective it should be. Evalua-
tion of effectiveness involves measuring the system capabil-
ity to get the desired result for a certain task or activity, e.g.,
shortening of co-design performance time, and improving
productivity, etc. Kiyoshi et al. [63] described two experiments verifying the effectiveness of a shared AR environment compared with a shared VR environment. Grasset et al. [132] conducted a study on the efficiency of collaboration in AR and VR environments; the experiments provided a better understanding of how to design interfaces for transitional collaboration between AR and VR viewpoints. In addition, Ko and Chang [123] presented comparative studies of AR against traditional methods to demonstrate the usefulness and intuitiveness of AR design conditions, which can decrease users' cognitive and functional load. Similarly, Wang et al. [2] demonstrated the effectiveness of MR design systems compared with a traditional one.
Usability evaluation concerns measuring the ease of use and learnability of AR-based co-design systems based on needs analysis, user interviews, expert evaluations [11], user surveys, performance measures, behavior measures, etc.; thus, usability studies can help identify system flaws at earlier stages of development. To prove the usability of shared virtual space in co-design, Gutwin et al. [138] conducted an experiment on whether awareness can improve the usability of the shared workspace; they provided empirical evidence and underlying reasons for supporting workspace awareness.
Regarding effectiveness and usability evaluation, there are large differences among researchers with respect to interaction approaches, the complexity of the product design models, etc. However, the factors that affect the effectiveness and usability of an AR-based co-design system are still unclear.
Another issue is that although some AR-based co-design systems have been implemented, few of them have been evaluated with rigorous user studies [11, 61]. From Table 2, we find that as AR technology has matured, research priorities have gradually shifted to the effectiveness and usability of the applications and to HCI. Thus, more formal user studies of interaction and of the effectiveness/usability of co-design systems need to be conducted in the future, especially in areas such as error tolerance, individual differences, and pragmatic and holistic design. In addition, general evaluation architectures and methods need to be investigated to evaluate the effectiveness and usability of AR-based co-design systems.
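As an example of the kind of standardized instrument such formal studies could adopt, the scoring of the System Usability Scale (SUS) is sketched below. SUS is a widely used usability questionnaire; it is offered here as one common option, not as a method prescribed by the surveyed papers.

```python
# Standard SUS scoring: ten 1-5 Likert items; odd items (index 0, 2, ...)
# contribute (response - 1), even items contribute (5 - response), and
# the sum is scaled by 2.5 to give a 0-100 score.
def sus_score(responses):
    """responses: ten 1-5 Likert answers, item 1 first. Returns 0-100."""
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))
    return total * 2.5

# All-neutral answers (3) give the midpoint score.
print(sus_score([3] * 10))  # -> 50.0
```

A single score like this makes AR co-design prototypes directly comparable across studies, which is exactly what the field currently lacks.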
4.2 Interaction techniques
In the AR/MR-based design workspace, it is quite clear that
interaction techniques have a great influence on the effec-
tiveness, intuitiveness, and naturalness of creating, modify-
ing, and manipulating virtual design product models. There-
fore, the interaction technique is another important research
focus for AR/MR-based modeling, and researchers need to
attempt to create new design tools and technologies facilitat-
ing alternative visualization and representation.
To support the construction, modification, and manipulation of accurate design models, some interactive tools and methods have been successfully developed to aid design activities. For example, Ong et al. [13, 14, 72, 97, 98] developed a virtual interaction panel (VirIP) for controlling AR-based co-design systems. This employed a virtual stylus rendered on an ARToolKit marker to interact with the shape control points (SCP) and buttons on virtual panels, as shown in Fig. 10. The tool is composed of a context-sensitive VirIP, an interactive cursor, and a feature selection method. The interaction technique provides a 3D/2D grid-and-snap mode that allows designers to locate features and to create the desired 3D spatial coordinates, sketches, etc.
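A virtual panel of this kind ultimately reduces to hit-testing the tracked stylus tip against button regions in the panel's local coordinates. The sketch below illustrates the idea only; it is not the actual VirIP implementation, and the button names, dimensions, and touch threshold are invented.

```python
# Illustrative virtual-panel hit test (not the actual VirIP code): a
# tracked stylus tip, expressed in the panel's local coordinates, is
# tested against button rectangles to trigger commands.
def hit_button(tip_xy, buttons, touch_depth=0.01, tip_z=0.0):
    """tip_xy: (x, y) stylus tip on the panel plane; tip_z: distance
    from the plane. buttons: {name: (x, y, w, h)}. Returns name or None."""
    if abs(tip_z) > touch_depth:  # stylus hovering, not pressing
        return None
    x, y = tip_xy
    for name, (bx, by, w, h) in buttons.items():
        if bx <= x <= bx + w and by <= y <= by + h:
            return name
    return None

# Hypothetical panel layout with two buttons (sizes in meters).
panel = {"extrude": (0.0, 0.0, 0.1, 0.05), "sketch": (0.0, 0.06, 0.1, 0.05)}
assert hit_button((0.05, 0.02), panel) == "extrude"
assert hit_button((0.05, 0.08), panel) == "sketch"
assert hit_button((0.05, 0.02), panel, tip_z=0.05) is None  # no press
```

Because the test runs in panel space, the same logic works regardless of where the tracked marker places the panel in the world.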
Friendly, concise user interfaces and efficient interaction instruments play a key role in AR/MR-based co-design, and physical-to-physical interaction between designers is also significant. During the AR/MR-based co-design process, designers can observe the virtual design model from different perspectives. Previous research has shown how speech recognition can be used to retrieve the co-design history document and carry out simple manipulations such as zooming in and out of the design model [72]. In simultaneous co-design, designers can also select and modify the features of the design model. In the visualization-based co-design system, common manipulations such as rotation, zoom, and translation in one client's view do not collide with the manipulations in the other clients' views. These manipulations and selections of models can also be accomplished with a virtual beam [47] or a personal interaction panel (PIP) [51, 83, 139].
Moreover, Budhiraja etal. [140] explored novel interac-
tion techniques that can take advantage of having both HMD
and handheld displays (HHD), which have been adopted by
DualCAD [20]. More importantly, researchers and practi-
tioners also can lots of interaction techniques from several
successful AR/MR design platform, such as a container and
parallel objects in Spacetime [73], a handheld reference
plane in MobiSweep [74], the “team first” ideation tools
based on a concept component hierarchy and collaborative
design explorer in Co-3Deator [62], the sketch-and-inflate
scheme based on a tangible MR interaction in Window-
Shaping [75], a self-contained 3D input in TMotion [76],
and a path viewer in skWiki [77], as shown in Fig.11.
4.3 Empathic computing
Empathic computing is a research field exploring how tech-
nology like AR/MR and VR can be used to help increase
empathy or create deeper shared understanding between
designers [86, 141]. This requires a combination of rich
natural collaboration, capturing the user experience and
Fig. 10 VirIP [13]
Fig. 11 Useful interaction techniques for AR/MR co-design [62, 73, 74, 77]
11 http://artoolkit.org/
12 http://www.empathiccomputing.org/
surroundings, and being able to implicitly understand user
emotion and context [86], as illustrated in Fig.12. It can be
grouped into three major parts: measuring, understanding,
and sharing empathy based on the role of empathy at differ-
ent stages of co-design.
During the AR/MR-based co-design process, empathy plays an important role in facilitating understanding of designers' awareness, behaviors, and emotions, and in aiding the understanding of collaborators' activities and mental states. That is, designers can share the augmented design workspace and increase empathy with each other, overcoming two main limitations of collaboration: the separation of task space and communication space, and the lack of support for the communication cues used in face-to-face conversation. Wang et al. [23] proposed a computer-mediated remote co-design system, TeleAR, which can enhance distributed cognition among remote designers, and they further investigated how the system affects designers' communication and collaboration, focusing on distributed cognition and mutual awareness. Gupta et al. [85] developed a remote collaboration system showing that eye-tracking cues can improve efficiency, and demonstrated the benefit of being aware of the local collaborator's focus of attention in communication. Katsutoshi et al. [48] described the concept of Empathy Glasses, which enable a user to see and hear from another user's perspective, and showed that they can enhance collaborative systems. Researchers [23, 48, 63, 85–87] concluded that designers' empathic cues (e.g., facial expressions, gaze directions, gestures, etc.) can enhance common-ground awareness and natural interpersonal interaction for efficient collaboration.
Actually, as more and more researchers focus on this field, we can easily measure collaborators' physiological signals in real time using the Empatica E4 wristband or similar tools, and then transmit them to other designers to help them understand one another. However, the mechanisms underlying such improvement remain unclear, and there has been little related progress in this research [67, 142]. More research should be done to identify whether and how the empathy of collaborators involves unconscious processing and conscious understanding in influencing emotional contagion, and thereby collaborators' emotions, actions, and perceptions of design performance [67, 143, 144]. In particular, future related research in the remote AR/MR-based co-design environment might be extremely valuable.
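As a toy illustration of sharing such physiological cues, the sketch below smooths a streamed signal (e.g., electrodermal activity samples) with a moving average before it would be sent to a remote collaborator. The window size and the idea of smoothing before sharing are our own assumptions; no real wristband API is used.

```python
# Hedged sketch: smooth a streamed physiological signal before sharing
# it with a remote collaborator. Window size is arbitrary; the sharing
# transport (network, session, etc.) is out of scope.
from collections import deque

class SmoothedSignal:
    def __init__(self, window=4):
        self.samples = deque(maxlen=window)  # sliding window of raw samples

    def push(self, value):
        """Add a raw sample; return the current moving average to share."""
        self.samples.append(value)
        return sum(self.samples) / len(self.samples)

eda = SmoothedSignal(window=4)
shared = [eda.push(v) for v in [1.0, 1.0, 3.0, 3.0]]
# last shared value is the mean of the full 4-sample window: 2.0
```

Smoothing of this kind would keep momentary sensor noise from being misread by the remote partner as a genuine emotional change.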
4.4 Mobile augmented reality (MAR)‑based
co‑design
As smartphones and tablets have become more popular and powerful, HHDs have become one of the most promising platforms for AR applications, and mobile augmented reality (MAR)-based co-design has become possible. A key differentiating factor between HMDs and HHDs is the lower cost of access to information with HHDs. Modern smartphones and tablets combine fast CPUs, high-resolution cameras, and sensors such as GPS, compasses, and gyroscopes; they are an ideal platform for MAR-based co-design. Thus, a co-design system should be compatible with MAR systems, allowing outdoor and indoor designers to access and manage co-design information [145, 146]. Rahul and Seokhee et al. [140, 147] investigated novel interaction techniques for HHDs that provide intuitive 2D/3D interactive functions, such as multi-layered visualization, situation-adaptive visualization management, and motion-based gesture interaction. Leman et al. [129] presented a new marker- and MAR-based co-design platform to understand a designer's design reasoning and communication activities. Alexandre et al. [20] introduced the DualCAD system, which explores the close integration of HHD, HMD, and desktop CAD in one AR-CAD system. Leman and Süheyla [148] studied co-designers' communication and interaction when working with MAR; the results showed that MAR technology can improve co-design activities. The portable and ubiquitous nature of MAR makes it ideal for co-design processes.
Recently, Huo etal. [75] using the Google Tango device
proposed a tangible MR interaction metaphor: Window-
Shaping, which allows for the dimensionally consistent,
visually coherent insitu design of 3D shapes on and around
physical objects by the sketch-and-inflate scheme. They
demonstrated that the presented sketch-based 3D mod-
eling metaphor has the expressiveness, effectiveness, and
potential. Ramanujan etal. [74] developed spatial design
ideation: MobiSweep, using a smartphone as a reference
plane for design 3D sweep surfaces. Moreover, Yoon etal.
[76]. presented a self-contained 3D input: TMotion, which
can enable spatial interactions around a smartphone using a
Fig. 12 Empathic computing [86, 141]
13 https ://www.empat ica.com/e4-wrist band.
Engineering with Computers
1 3
magnetic tracking technique. The exploring results showed
that the emerging MAR devices (e.g., smartphone and tablet)
have great potential scalability and practicability in design.
Importantly, we think MAR devices combining ARCore,
ARKit, and SLAM will increasingly empower and drive the
perfection of the AR/MR-based co-system.
4.5 TUIs
An AR-based co-design system should be fast, accurate, and concise. For instance, the display can make additional, job-critical information easy to access and understand. Moreover, the system should be intuitive, ergonomic, and easy to use and learn. There is a trend towards TUIs supporting spatial 3D interaction techniques, and bimanual interaction offers an alternative to the conventional keyboard and mouse for interacting with 3D design models [149]. The TUI builds upon our dexterity by embodying digital information in physical space, and TUIs expand the affordances of physical objects, surfaces, and spaces, so they can support direct engagement with the digital world [59, 150]. Integrating TUIs into co-design modeling systems offers a number of important benefits and requires relatively little training. In addition, such systems not only provide a very intuitive method for viewing 3D design information but also support modifying the design models. In the AR/MR environment, there is a close connection between virtual and physical objects. This inspires a TUI design direction that makes full use of the immediacy and familiarity of everyday physical objects to manipulate virtual objects [149]. Therefore, the potential to integrate TUIs as an interface for natural remote and co-located co-design should be explored further [3]. Radical Atoms is a vision for the future of human-material interaction in which we interact directly with dynamic physical materials. Thus, we no longer think of designing the interface, but rather of a Material User Interface (MUI), as shown in Fig. 13 [150]. This can provide an immersive AR-based co-design environment and a convenient entry point for designers who are unfamiliar with professional AR theory or programming skills.
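The tangible mapping underlying most TUIs, a tracked physical prop directly driving a virtual model, can be sketched minimally as follows. All names are illustrative; a real system would obtain the prop pose from an AR tracking library each frame.

```python
# Minimal TUI sketch: the tracked pose of a physical prop (e.g., a
# fiducial-marked block) is applied directly to a virtual model, so
# moving the prop moves the model. Names and offsets are illustrative.

class VirtualModel:
    def __init__(self, name):
        self.name = name
        self.position = (0.0, 0.0, 0.0)
        self.rotation = (0.0, 0.0, 0.0, 1.0)  # quaternion (x, y, z, w)

def on_prop_tracked(model, prop_position, prop_rotation, offset=(0.0, 0.05, 0.0)):
    """Called each frame with the prop's tracked pose; the virtual model
    follows the prop with a small fixed offset so it appears to sit on it."""
    model.position = tuple(p + o for p, o in zip(prop_position, offset))
    model.rotation = prop_rotation
    return model

gear = VirtualModel("gear-housing")
on_prop_tracked(gear, (0.10, 0.00, -0.50), (0.0, 0.0, 0.0, 1.0))
assert gear.position == (0.10, 0.05, -0.50)
```

The appeal of this mapping is that the manipulation vocabulary (grasp, turn, place) needs no training, which is exactly the TUI benefit discussed above.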
5 Future trends
Although much progress in AR/MR has been made in recent years, AR/MR-based co-design in manufacturing is still at an exploratory stage. In this section, we review potential research ideas to further enhance and improve AR/MR-based co-design systems. Based on the comprehensive analysis and current research issues, researchers who plan to pursue new work on AR/MR-based co-design should consider the following topics.
5.1 AR/MR‑based collaborative conceptual design
During conceptualization, the design is still developing and the designers are mainly concerned with the rapid generation of ideas. AR/MR-based co-design can offer the advantages of concurrent contextualization of the product conceptual design, natural user interfaces for creating 3D models, and everyday objects used as a medium for design in an AR/MR workspace. Besides, it can allow co-designers to explore more concepts from users, based on highly realistic representations, much earlier and more easily than with a traditional system [151]. Some studies [152–156] have reported that AR technology has the potential to improve conceptual design. Therefore, we think there is still a great deal of valuable work to do on collaborative conceptual design using AR/MR.
Fig. 13 Iceberg metaphor (http://tangible.media.mit.edu/vision/): from a GUI to b TUI to c MUI [150]
5.2 Multimodal interaction
We inherently interact with the world in a multimodal way. Multimodal HCI has been pursued for several decades to endow systems with similar capabilities and to provide more natural, powerful, and compelling interactive experiences [157]. During the co-modeling process, real-time and natural interaction is one of the central features of AR-based co-design. Interaction should be user-friendly, allowing designers to conveniently and intuitively interact with virtual models and operate real objects in the shared augmented design workspace. Multimodal interaction can provide a more natural style of HCI, allowing designers to apply their well-developed, flexible communication skills when interacting with AR-based co-design systems. Using multimodal interaction in a system has been shown to offer numerous performance advantages [158]. It enables users to employ more natural modalities of communication, including voice, gesture, eye tracking, etc. [159]. Ideally, multimodal interaction can make full use of the advantages of each modality, with the weakness of one modality compensated by another, achieving better combined usability in general.
With rapid advances in powerful devices and affordable sensors, the prospects for interaction are very promising: the Leap Motion14 and RGB-Depth cameras such as the Microsoft Kinect15 and SoftKinetic16 can be used to capture gestures [160] and dense 3D maps of the workspace [161, 162]. Gaze interaction is another method for fast, natural, easy, and intuitive interaction [163, 164], using tools such as the open-source Pupil Labs17 eye-tracker platform, Tobii,18 and EyeTracking.19 Chatterjee et al. [165] showed that combining gaze and gesture input yields rapid, precise, and expressive interaction. Deng et al. [129] investigated the impact of multimodal interaction using gaze and gesture for manipulating 3D virtual objects. Song et al. [166] developed GaFinC, a multimodal interaction interface for manipulating 3D models in a CAD system. However, few AR/MR systems have explored how multimodal interaction might be used for co-design.
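A common fusion pattern in the gaze-plus-gesture studies cited above is that gaze selects the target while a gesture decides the operation. The sketch below illustrates that division of labor; the object names, gesture labels, and scene layout are our own illustrative assumptions, not taken from any cited system.

```python
# Illustrative gaze + gesture fusion: gaze picks the nearest object,
# the concurrent gesture determines what happens to it.

def nearest_to_gaze(objects, gaze_point):
    """Select the object whose centre is closest to the 3D gaze point."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(objects, key=lambda o: dist2(o["centre"], gaze_point))

def fuse(objects, gaze_point, gesture):
    """Gaze resolves the target; the gesture resolves the operation."""
    target = nearest_to_gaze(objects, gaze_point)
    if gesture == "pinch":
        return ("grab", target["name"])
    if gesture == "swipe":
        return ("rotate", target["name"])
    return ("highlight", target["name"])  # gaze alone only highlights

scene = [{"name": "bracket", "centre": (0.0, 0.0, 1.0)},
         {"name": "shaft", "centre": (0.5, 0.2, 1.0)}]
assert fuse(scene, (0.45, 0.2, 1.0), "pinch") == ("grab", "shaft")
```

The design choice here mirrors the synergy reported by Chatterjee et al. [165]: gaze is fast but imprecise for commands, gestures are expressive but slow for pointing, so each modality covers the other's weakness.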
In addition, during AR/MR-based co-design, one designer may be in an AR workspace while another is at a remote VR site, a scenario where much work remains to be done beyond gaze, gestures, motion capture, etc. In such cases, the interaction techniques and processes in the VR environment need to be considered to make full use of the advantages of both VR and AR.
5.3 Integration ofcloud‑based co‑design (CBCD)
andAR‑based co‑design
Today’s highly competitive global marketplace is quickly reshaping many design and manufacturing companies. In this context, CBCD is developing rapidly. It refers to a networked design model that exploits cloud computing technology to support cloud-based engineering design services in distributed and collaborative environments [167]. CBCD and AR-based co-design are complementary in many respects. In remote AR-based co-design, it is important to establish a cloud-based environment for effective information interchange, allowing for enhanced information flow management that can significantly improve AR-based co-design productivity. Moreover, as stated in refs [37, 167], CBCD has the potential to allow a distributed design team to conduct co-design activities more efficiently. From this perspective, it is imperative to integrate CBCD with AR-based co-design. An integrated system can efficiently support interrelated design activities and quickly share information between designers and systems, reducing product design lifecycle costs and improving product design quality and efficiency.
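The information interchange that such an integrated system needs can be sketched as a publish/subscribe relay: design edits are published as small events that a cloud service forwards to every collaborator's AR client. The class below is an in-memory stand-in for that relay; all names and the event schema are illustrative assumptions, not part of any cited CBCD system.

```python
# Illustrative sketch of CBCD-style information interchange: design edits
# travel as small JSON events through a relay to all subscribed AR clients.

import json

class DesignEventBus:
    """In-memory stand-in for a cloud relay; a real deployment would use a
    message broker or WebSocket service."""
    def __init__(self):
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, event):
        payload = json.dumps(event)  # what would travel over the network
        for cb in self.subscribers:
            cb(json.loads(payload))

received = []
bus = DesignEventBus()
bus.subscribe(received.append)
bus.publish({"op": "move", "part": "flange", "delta": [0, 0, 5]})
assert received[0]["part"] == "flange"
```

Keeping events small and model-independent is what would let the same stream feed both a desktop CAD session and a lightweight AR client.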
5.4 Design review (DR)
It is critical that product DR can be completed within an AR/MR-based co-design system. In general, the DR process is very tedious and time-consuming, and it becomes especially inefficient when team members are located in different geographical regions [21, 43]. Ideally, the DR committee could conduct their evaluations in real time, in parallel with one another [43, 115]. In the past several decades, several studies [102, 168] have indicated that AR/MR environments are well suited to collaborative DR. Furthermore, it is important to allow multiple designers to annotate at the same time during the DR process [115]. AR/MR technology can facilitate problem-solving, increase work throughput, and improve design error detection. Hence, DR should be considered in both remote and co-located AR/MR-based co-design, where it can dramatically improve review efficiency across many key characteristics.
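Simultaneous annotation, the capability highlighted above, reduces at its simplest to merging per-reviewer annotation streams into one time-ordered review log. The sketch below shows that merge; the reviewer names, fields, and notes are purely illustrative.

```python
# Sketch of simultaneous design-review annotation: several reviewers
# annotate concurrently; their streams are merged into one timeline
# ordered by timestamp. Field names are illustrative.

def merge_annotations(*reviewer_streams):
    """Merge per-reviewer annotation lists into one time-ordered review log."""
    merged = [a for stream in reviewer_streams for a in stream]
    return sorted(merged, key=lambda a: a["t"])

alice = [{"t": 1.0, "who": "alice", "note": "fillet radius too small"}]
bob = [{"t": 0.5, "who": "bob", "note": "check wall thickness"},
       {"t": 2.0, "who": "bob", "note": "approve rev B"}]
log = merge_annotations(alice, bob)
assert [a["who"] for a in log] == ["bob", "alice", "bob"]
```

A production system would also anchor each annotation to a model feature and handle conflicting edits, but the timestamped merge is the core that lets parallel reviews replace the sequential handoffs criticized above.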
14 https://www.leapmotion.com/.
15 https://developer.microsoft.com/en-us/windows/kinect.
16 https://www.softkinetic.com/.
17 https://pupil-labs.com/pupil/.
18 https://www.tobii.com/.
19 http://www.eyetracking.com/.
5.5 Security andinteroperability
As collaborators move AR/MR-based co-design to Internet-based collaboration, security has to be carefully considered. While many security solutions offered by current co-design systems address the transmission level, these systems can also benefit from taking additional security factors into account for their design data [39]. However, security has received little consideration in current AR-based co-design systems. Interoperability between AR/MR-based co-design systems and Product Data Management (PDM) systems must also be achieved in the future. Therefore, an AR/MR-based co-design system should successfully handle design data import and export between the major CAD applications, and even make direct use of legacy CAD data.
5.6 More powerful CAD functions
In the future, there is no doubt that the CAD functions of AR/MR-based co-design systems will have to become more powerful. Researchers will be devoted to developing AR-based co-design systems that enable users to intuitively create, naturally modify, and easily manipulate 3D design models, and that also support visualization, rendering, and analysis using finite-element analysis (FEA) [169, 170] in the AR/MR environment. In addition, physics-based modeling and haptic interaction have great potential. On the one hand, physics-based modeling can dramatically improve designers' sense of immersion and interactivity, particularly in tasks requiring intensive manipulation based on the physical properties of 3D design models [171]. On the other hand, force cues provided by haptics technology can enhance how designers feel and understand the design model by supplementing visual and auditory cues [172, 173]. As AR/MR technology progresses, the cost of computing and visualization will decrease and their capabilities will increase. Hence, it will soon be possible to integrate faster and more robust algorithms into AR/MR-based co-design systems that handle more complex design models while incorporating physically appropriate behavior.
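As a concrete illustration of the kind of physically appropriate behavior meant above, consider a dragged part being pulled back toward a constraint by a damped spring. The explicit-Euler step below is a minimal sketch with illustrative parameter values, not a production physics engine.

```python
# Minimal physics-based-modeling sketch: one explicit-Euler step of a
# damped spring pulling a displaced part back to its constraint position.
# Stiffness k, damping c, mass m, and time step dt are illustrative.

def spring_step(x, v, target, k=50.0, c=8.0, m=1.0, dt=0.01):
    """Advance position x and velocity v one time step toward target."""
    force = -k * (x - target) - c * v   # Hooke spring + viscous damping
    v = v + (force / m) * dt
    x = x + v * dt
    return x, v

x, v = 1.0, 0.0                         # part displaced 1 unit from constraint
for _ in range(1000):                   # simulate 10 seconds
    x, v = spring_step(x, v, target=0.0)
assert abs(x) < 1e-2                    # part settles back onto the constraint
```

Even this toy model shows why such behavior aids manipulation: released parts return to valid configurations instead of floating, which is the immersion benefit noted in [171].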
6 Conclusions anddiscussion
For several decades, AR/MR has attracted an increasing number of researchers in many fields, such as manufacturing, games, industrial design, medicine, and education. Applications of AR/MR techniques offer many advantages in the design domain. We found that co-design in AR/MR environments provides an excellent platform for designers to design products quickly and efficiently and to explore potential design blueprints. An AR/MR-based co-design system is a powerful tool that allows designers to cooperatively design innovative virtual products in a virtual-real fusion environment. Taking advantage of context during the AR/MR-based co-design process lets designers concentrate on the design itself and on its relationship with the external environment.
From the comprehensive survey, we found that many studies focused on HCI from different aspects through user studies, examining preference, ease of use, performance, intuitiveness, naturalness, etc. Therefore, evaluating usability/effectiveness and interaction techniques through user studies is popular. In addition, VR/AR/MR are relatively personalized platforms that pave the way for capturing designers' data (e.g., head/eye gaze, emotional cues, facial expressions, and heart rate). The co-design task environment can also be captured by spatial mapping and imaging technology, providing the co-design context [86]. Making the most of all this information, it would be possible to establish an empathic co-design system. Such a system would facilitate understanding designers' awareness, behavior, and emotions, and maintain the co-design grounding of what is being discussed. Importantly, researchers and practitioners in AR/MR-based remote co-design can learn many lessons from studies [85, 87, 88, 124–128, 174, 175] on AR/MR remote collaboration for physical tasks. More specifically, these lessons cover five aspects: (1) sharing the task workspace with partners using video and audio is indispensable; (2) sharing real-time space capture via 360° video and 3D reconstruction is attracting more and more attention; (3) it is necessary to support nonverbal communication cues, such as head/eye gaze, gestures, sketches, and avatars; (4) with the rapid development of machine/deep learning, AI, 5G, cloud/fog computing, and smart sensors, it is becoming easier to obtain and share designers' implicit cues; and (5) an MR co-design system can take advantage of the merits of both VR and AR, paving the way for novel interaction. Moreover, AR/MR co-design would create more value by integrating VR/AR/MR into commonly used CAD systems, such as SolidWorks, Autodesk CAD, and Creo. It should be noted that PTC, with Creo, is an active practitioner and facilitator in this field. In some cases, mobile AR/MR-based co-design has tremendous potential through handheld devices (smartphones and tablets) as these devices become more popular and powerful, supporting the Internet of Things (IoT), ARKit, ARCore, and SLAM.
In this paper, we report on AR/MR-based co-design applications between 1990 and 2017. This is the first survey to provide a comprehensive review of, and critical comments on, the state of the art in AR/MR-based co-design. A full and comprehensive academic map facilitates further study for AR/MR-based co-design researchers all over the world. This review began with a description of traditional co-design technologies, followed by a detailed account of how AR/MR-based co-design systems are now being applied and tested. Considering the functions of the systems and the roles of designers in a co-design activity, AR/MR-based co-design systems can be roughly grouped into two types: visualization-based co-design systems and simultaneous co-design systems. Both types are usually developed using the client-server framework, and most existing systems are of the visualization-based type. Moreover, we generalized the key features and technologies of AR/MR-based co-design, and we summarized current imperatives and research directions. Many different research directions have been investigated and extensively discussed. The directions of evaluating system effectiveness and usability, interaction techniques, HCI, and TUIs remain hot issues, as a relatively steady number of papers per year address these topics. Recently, empathic computing, multimodal interaction, and DR in AR/MR-based co-design have emerged as promising research directions. In addition, there is no doubt that an AR/MR-based co-design system is a complex nonlinear dynamical network, in which every designer is a network node, with the designer's ideas and related manipulations as node attributes. As a result, it is important to improve the robustness of the co-design system by studying the dynamic behavior of such network systems. Finally, future trends in AR/MR-based co-design were presented.
According to the analysis in this paper, it is still chal-
lenging to bring AR/MR-based co-design research from
laboratories to industry and widespread use. Fortunately,
more researchers are keeping a watchful eye on it, and it is
becoming easier than ever before to be involved in AR/MR-
based co-design research. We firmly believe that related studies on AR/MR-based co-design will constitute an essential step toward intelligent manufacturing.
Acknowledgements This research was sponsored by the civil aircraft special project (MJZ-2017-G73). We would like to thank the anonymous reviewers for their constructive suggestions for enhancing this paper. Specifically, Mark Billinghurst checked the English of an early version, and his constructive comments, which helped the authors improve the paper, are gratefully acknowledged. Moreover, we thank Dechuan Han and Mengmeng Sun for filtering and downloading papers. We also thank other schoolfellows, including Jinping Chen, Qian Cheng, Li Zhang, Yongxing Chen, Zhuang Chang, Yue Wang, and Xiaokun Zhang, for their invaluable discussion, criticism, and advice for further improving this paper.
References
1. Chandrasegaran SK, Ramani K, Sriram RD, Horváth I, Bernard A, Harik RF et al (2013) The evolution, challenges, and future of knowledge representation in product design systems. Comput Aided Des 45(2):204–228
2. Wang X, Dunston PS (2011) Comparative effectiveness of mixed
reality-based virtual environments in collaborative design. IEEE
Trans Syst Man Cybern Part C (Appl Rev) 41(3):284–296
3. Gu N, Mi JK, Maher ML (2011) Technological advancements
in synchronous collaboration: the effect of 3d virtual worlds and
tangible user interfaces on architectural design. Autom Constr
20(3):270–278
4. Karakaya AF, Demirkan H (2015) Collaborative digital environ-
ments to enhance the creativity of designers. Comput Hum Behav
42:176–186
5. Mitchell V, Ross T, May A, Sims R, Parker C (2016) Empirical
investigation of the impact of using co-design methods when
generating proposals for sustainable travel solutions. Codesign
12(4):205–220
6. Tseng MM, Du X (1998) Design by customers for mass customi-
zation products. CIRP Ann Manuf Technol 47(1):103–106
7. Porter ME, Heppelmann JE (2017) Why every organization needs
an augmented reality strategy. Harv Bus Rev 95(6):46–57
8. Mohr P, Mandl D, Tatzgern M, Veas E, Schmalstieg D, Kalkofen
D. (2017) Retargeting video tutorials showing tools with surface
contact to augmented reality. In: Proceedings of the 2017 CHI
Conference on Human Factors in Computing Systems. ACM, pp
6547–6558
9. Lukosch S, Billinghurst M, Alem L, Kiyokawa K (2015) Col-
laboration in Augmented Reality. Comput Support Cooperative
Work 24:515
10. See ZS, Billinghurst M, Rengganaten V, Soo S (2016) Medical
learning murmurs simulation with mobile audible augmented
reality. In: SIGGRAPH ASIA 2016 mobile graphics and interac-
tive applications, ACM, p. 4
11. Wang X, Ong SK, Nee AYC (2016) A comprehensive survey of
augmented reality assembly research. Adv Manuf 4(1):1–22
12. Henderson S, Feiner S (2011) Exploring the benefits of augmented reality documentation for maintenance and repair. IEEE Trans Visual Comput Graph 17(10):1355
13. Shen Y, Ong SK, Nee AYC (2010) Augmented reality for collab-
orative product design and development. Des Stud 31(2):118–145
14. Ong SK, Shen Y (2009) A mixed reality environment for
collaborative product design and development. CIRP Ann
58(1):139–142
15. Nee AY, Ong SK, Chryssolouris G, Mourtzis D (2012) Aug-
mented reality applications in design and manufacturing. CIRP
Ann Manuf Technol 61(2):657–679
16. Ong SK, Yuan ML, Nee AYC (2008) Augmented real-
ity applications in manufacturing: a survey. Int J Prod Res
46(10):2707–2742
17. Huang TC, Chen CC, Chou YW (2016) Animating eco-educa-
tion: to see, feel, and discover in an augmented reality-based
experiential learning environment. Comput Educ 96:72–82
18. Wu HK, Lee WY, Chang HY, Liang JC (2013) Current status,
opportunities and challenges of augmented reality in education.
Comput Educ 62:41–49
19. Arroyave-Tobón S, Osorio-Gómez G, Cardona-McCormick JF
(2015) Air-modelling: a tool for gesture-based solid modelling in
context during early design stages in AR environments. Comput
Ind 66:73–81
20. Millette A, McGuffin MJ (2016) DualCAD: integrating aug-
mented reality with a desktop GUI and smartphone interaction.
In: Mixed and augmented reality (ISMAR-Adjunct), 2016 IEEE
International Symposium, IEEE, pp 21–26
21. Uva AE, Cristiano S, Fiorentino M, Monno G (2010) Distrib-
uted design review using tangible augmented technical drawings.
Comput Aided Des 42(5):364–372
22. Januszka M, Moczulski W (2011) Augmented reality system for
aiding engineering design process of machinery systems. J Syst
Sci Syst Eng 20(3):294
23. Wang X, Love PE, Kim MJ, Wang W (2014) Mutual aware-
ness in collaborative design: an augmented reality integrated
telepresence system. Comput Ind 65(2):314–324
24. Chen Y, Feng P, Lin Z (2005) A genetics-based approach for
the principle conceptual design of mechanical products. Int J
Adv Manuf Technol 27(3–4):225–233
25. Wang L, Shen W, Xie H, Neelamkavil J, Pardasani A (2002)
Collaborative conceptual design—state of the art and future
trends. Comput Aided Des 34(13):981–996
26. Li WD, Ong SK, Fuh JYH, Wong YS, Lu YQ, Nee AYC (2004)
Feature-based design in a distributed and collaborative envi-
ronment. Comput Aided Des 36(9):775–797
27. Chu CH, Cheng CY, Wu CW (2006) Applications of the web-
based collaborative visualization in distributed product devel-
opment. Comput Ind 57(3):272–282
28. Yin X, Wu Y, Liu Q, Wang S (2015) A web-based, collabora-
tive modeling, simulation, and parallel computing environment
for electromechanical systems. Adv Mech Eng 7(3):1–6
29. Qiang L, Zhang YF, Nee AYC (2001) A distributive and col-
laborative concurrent product design system through the www/
internet. Int J Adv Manuf Technol 17(5):315–322
30. Fan H, Liu Y, Hu B, Ye X (2016) Multidomain model integra-
tion for online collaborative system design and detailed design
of complex mechatronic systems. IEEE Trans Autom Sci Eng
13(2):709–728
31. Wang K, Takahashi A (2012) Semantic web based innovative
design knowledge modeling for collaborative design. Expert
Syst Appl 39(5):5616–5624
32. Monostori L, Váncza J, Kumara SRT (2006) Agent-based
systems for manufacturing. CIRP Ann Manuf Technol
55(2):697–720
33. Wang JX, Tang MX, Song LN, Jiang SQ (2009) Design and
implementation of an agent-based collaborative product design
system. Comput Ind 60(7):520–535
34. Shen W, Maturana F, Norrie DH (2000) Metamorph ii: an
agent-based architecture for distributed intelligent design and
manufacturing. J Intell Manuf 11(3):237–251
35. Chen Y, Liu ZL, Xie YB (2014) A multi-agent-based approach
for conceptual design synthesis of multi-disciplinary systems.
Int J Prod Res 52(6):1681–1694
36. Shen W, Norrie DH, Barthés JPA (2000) Multi-agent systems
for concurrent intelligent design and manufacturing, vol 47, no
11. CRC Press, Boca Raton, pp 1253–1254
37. Wu D, Rosen DW, Wang L, Schaefer D (2015) Cloud-based
design and manufacturing: a new paradigm in digital manufac-
turing and design innovation. Comput Aided Des 59:1
38. Schaefer D (2014) Cloud-based design and manufacturing
(CBDM). Springer, Berlin
39. Li WD, Lu WF, Fuh JYH, Wong YS (2005) Collaborative com-
puter-aided design-research and development status. Comput
Aided Des 37(9):931–940
40. Fuh JYH, Li WD (2005) Advances in collaborative cad: the-
state-of-the-art. Comput Aided Des 37(5):571–581
41. Shen W, Wang L (2003) Web-based and agent-based
approaches for collaborative product design: an overview. Int
J Comput Appl Technol 16(2–3):103–112
42. French D, Jensen G, Walker SS, Madsen P, Red E (2013)
Emerging design methods and tools in collaborative product
development. J Comput Inf Sci Eng 13(3):031001
43. Huang GQ (2002) Web-based support for collaborative product
design review. Comput Ind 48(1):71–88
44. Ren L, Zhang L, Tao F, Zhang XL, Zhang Y, Zhang Y (2012)
A methodology towards virtualisation-based high performance
simulation platform supporting multidisciplinary design of
complex products. Enterp Inf Syst 6(3):267–290
45. Gao S, Wan H, Peng Q (2000) An approach to solid modeling in a semi-immersive virtual environment. Comput Graph 24(2):191–202
46. Kiyokawa K, Billinghurst M, Hayes SE, Gupta A, Sannohe Y,
Kato H (2002) Communication behaviors of co-located users in
collaborative AR interfaces. In: mixed and augmented reality,
2002. ISMAR 2002. Proceedings. International symposium on
IEEE, pp 139–148
47. Rekimoto J (1996) Transvision: a hand-held augmented real-
ity system for collaborative design. Proc Virtual Syst Multimed
96:18–20
48. Masai K, Kunze K, Billinghurst M (2016) Empathy glasses. In:
Proceedings of the 2016 CHI Conference Extended abstracts on
human factors in computing systems. ACM, pp 1257–1263
49. Kouprie M, Visser FS (2009) A framework for empathy
in design: stepping into and out of the user’s life. J Eng Des
20(5):437–448
50. Yuan S, Dong H (2014) Empathy building through co-design. In:
International conference on universal access in human-computer
interaction. Springer, Cham, pp 85–91
51. Schmalstieg D, Fuhrmann A, Gervautz M (1998) “studierstube”:
an environment for collaboration in augmented reality. Virtual
Real 3(1):37–48
52. Blanding R, Turkiyyah G (2007) ECAD: a prototype screen-based VR solid modeling environment incorporating tangible deformable models. Comput Aided Des Appl 4(5):595–605
53. Jezernik A, Hren G (2003) A solution to integrate computer-
aided design (CAD) and virtual reality (VR) databases in
design and manufacturing processes. Int J Adv Manuf Technol
22(11–12):768–774
54. Ma W, Zhong Y, Tso SK, Zhou T (2004) A hierarchically struc-
tured and constraint-based data model for intuitive and precise
solid modeling in a virtual reality environment. Comput Aided
Des 36(10):903–928
55. Zhong Y, Ma W, Shirinzadeh B (2005) A methodology for solid
modelling in a virtual reality environment. Robot Comput Integr
Manuf 21(6):528–549
56. Wang QH, Li JR, Wu BL, Zhang XM (2010) Live parametric
design modifications in cad-linked virtual environment. Int J Adv
Manuf Technol 50(9–12):859–869
57. Kan HY, Duffy VG, Su CJ (2001) An internet virtual reality col-
laborative environment for effective product design. Comput Ind
45(2):197–213
58. Mahdjoub M, Monticolo D, Gomes S, Sagot JC (2010) A col-
laborative design for usability approach supported by virtual real-
ity and a multi-agent system embedded in a plm environment.
Comput Aided Des 42(5):402–413
59. Ishii H (1997) Tangible bits: towards seamless interfaces between
people, bits and atoms. In: ACM Sigchi conference on human
factors in computing systems. ACM, pp 234–241
60. Dunston P, Wang X, Billinghurst M, Hampson B (2003) Mixed
reality benefits for design perception. In: 19th International Sym-
posium on automation and robotics construction (ISARC 2002),
pp 191–196
61. Zhou F, Duh BL, Billinghurst M (2008) Trends in augmented
reality tracking, interaction and display: a review of ten years of
ISMAR. In: IEEE/ACM International Symposium on mixed and
augmented reality. IEEE, pp 193–202
62. Piya C, Vinayak, Chandrasegaran S, Elmqvist N, Ramani K (2017) Co-3Deator: a team-first collaborative 3D design ideation tool. In: CHI Conference on human factors in computing systems. ACM, pp 6581–6592
63. Kiyokawa K, Iwasa H, Takemura H, Yokoya N (1998) Collabo-
rative immersive workspace through. In: Proceedings of SPIE
International Symposium on intelligent systems and advanced
manufacturing, Vol. 3517, pp 2–13
64. Billinghurst M, Kato H (2002) Collaborative augmented reality.
Commun ACM 45(7):64–70
65. Tait M, Billinghurst M (2015) The effect of view independence
in a collaborative AR system. Comput Support Coop Work
24(6):563–589
66. Tan CSS, Luyten K, Jan VDB, Schöning J, Coninx K (2014) The
role of physiological cues during remote collaboration. Presence
23(1):90–107
67. Xolocotzin Eligio U, Ainsworth SE, Crook CK (2012) Emotion
understanding and performance during computer-supported col-
laboration. Comput Hum Behav 28:2046
68. Lu CY, Shpitalni M, Gadh R (1999) Virtual and augmented
reality technologies for product realization. CIRP Ann Manuf
Technol 48(2):471–495
69. Wang X, Gu N, Marchant D (2008) An empirical study on
designers’ perceptions of augmented reality within an architec-
tural firm. Electron J Inf Technol Constr 13:536–552
70. Li JR, Tang C, Wang QH (2015) A proxy approach to integrate
heterogeneous CAD resources for distributed collaborative
design. Int J Comput Integr Manuf 28(6):593–606
71. Shen Y, Ong SK, Nee AYC (2006) A Framework for Multiple-
View Product Representation Using Augmented Reality. In:
International Conference on cyberworlds. IEEE Computer Soci-
ety, pp 157–164
72. Shen Y, Ong SK, Nee AYC (2008) Product information visuali-
zation and augmentation in collaborative design. Comput Aided
Des 40(9):963–974
73. Xia H, Herscher S, Perlin K, Wigdor D (2018) Spacetime: ena-
bling fluid individual and collaborative editing in virtual reality.
In: The 31st Annual ACM Symposium on user interface software
and technology. ACM, pp 853–866
74. Vinayak, Ramanujan D, Piya C, Ramani K (2016) MobiSweep:
exploring spatial design ideation using a smartphone as a hand-
held reference plane. Proceedings of the TEI’16: tenth interna-
tional conference on tangible, embedded, and embodied interac-
tion. ACM, pp 12–20
75. Huo K, Vinayak, Ramani K (2017) Window-Shaping: 3D design ideation by creating on, borrowing from, and looking at the physical world. In: Eleventh International Conference on tangible, embedded, and embodied interaction, 2017
76. Yoon SH, Huo K, Ramani K (2016) TMotion: embedded 3D mobile input using magnetic sensing technique. In: TEI '16: Tenth International Conference on tangible, embedded, and embodied interaction, 2016
77. Zhao Z, Badam SK, Chandrasegaran S, Park DG, Elmqvist N, Kisselburgh L, Ramani K (2014) skWiki: a multimedia sketching system for collaborative creativity. In: ACM Conference on human factors in computing systems, 2014
78. Dey A, Billinghurst M, Lindeman RW, Swan JE II (2017) A systematic review of usability studies in augmented reality between 2005 and 2014. In: IEEE International Symposium on mixed and augmented reality, IEEE, pp 49–50
79. Ahlers KH, Kramer A, Breen DE, Chevalier PY, Crampton C,
Rose E, Tuceryan M, Whitaker RT, Greer D (1995) Distributed
augmented reality for collaborative design applications. Comput
Graph Forum 14(3):3–14
80. Kiyokawa K, Takemura H, Katayama Y, Iwasa H, Yokoya N
(1996) Vlego: a simple two-handed modeling environment based
on toy blocks. Proceedings of the ACM symposium on virtual
reality software and technology. ACM, pp 27–34
81. Kiyokawa K, Takemura H, Yokoya N (1999) Seamlessdesign:
a face-to-face collaborative virtual/augmented environment for
rapid prototyping of geometrically constrained 3-d objects. In:
Multimedia computing and systems, 1999. IEEE International
Conference on (Vol. 2, pp 447–453), IEEE
82. Schumann H, Burtescu S, Siering F (1998) Applying
augmented reality techniques in the field of interactive
collaborative design. 3D structure from multiple images of
large-scale environments. Springer, Berlin, pp 290–303
83. Schmalstieg D, Fuhrmann A, Hesina G, Gervautz M, Purgath-
ofer W (2002) The studierstube augmented reality project.
Presence Teleoperators Virtual Environ 11(1):33–54
84. Grønbæk K, Mogensen P, Ørbæk P (2001) Interactive room support for complex and distributed design projects. INTERACT, pp 407–414
85. Gupta K, Lee GA, Billinghurst M (2016) Do you see what i
see? the effect of gaze tracking on task space remote collabora-
tion. IEEE Trans Visual Comput Graphics 22(11):2413–2422
86. Piumsomboon T, Lee Y, Lee GA, Dey A, Billinghurst M (2017)
Empathic mixed reality: sharing what you feel and interacting
with what you see. In: International Symposium on Ubiquitous
Virtual Reality, IEEE, pp 38–41
87. Lee Y, Masai K, Kai K, Sugimoto M, Billinghurst M (2017) A
remote collaboration system with empathy glasses. In: IEEE
International Symposium on mixed and augmented reality,
IEEE, pp 342–343
88. Gao L, Bai H, Lee G, Billinghurst M (2016) An oriented point-
cloud view for MR remote collaboration. In: SIGGRAPH
ASIA 2016 Mobile Graphics and Interactive Applications.
ACM, p 8
89. Regenbrecht HT, Wagner M, Baratoff G (2002) Magicmeeting:
a collaborative tangible augmented reality system. Virtual Real
6(3):151–166
90. Wagner MT, Wagner MT (2002) Interaction in a collabora-
tive augmented reality environment. In: CHI ‘02 Extended
Abstracts on human factors in computing systems, ACM, pp
504–505
91. Klinker G, Dutoit AH, Bauer M, Bayer J, Novak V, Matzke D (2002) Fata Morgana: a presentation system for product design. In: International Symposium on mixed and augmented reality, IEEE Computer Society, p 76
92. Liu P, Georganas ND, Boulanger P (2002) Designing real-time
vision based augmented reality environments for 3D collabora-
tive applications. Des Real, IEEE
93. Verlinden JC (2003) Development of a flexible augmented pro-
totyping system. UNION Agency–Science Press, pp 1–8
94. Hirakawa M, Koike S (2004) A collaborative augmented reality
system using transparent display. In: IEEE Sixth International
Symposium on multimedia software engineering, IEEE Com-
puter Society, pp 410–416
95. Penn A, Mottram C, Schieck AFG, Wittkämper M, Störring M,
Romell O, Strothmann A, Aish F (2004) Recent advances in
design and decision support systems in architecture and urban
planning. Springer, Dordrecht, pp 213–231
96. Broll W, Lindt I, Ohlenburg J, Wittkamper M, Yuan C, Novotny
T, Schieck AFG, Mottram C, Strothmann A (2004) Arthur: a col-
laborative augmented environment for architectural design and
urban planning. J Virtual Real Broadcast 1(1):1–10
97. Yuan ML, Ong SK, Nee AYC (2004) The virtual interaction
panel: an easy control tool in augmented reality systems. Comput
Anim Virtual Worlds 15(3–4):425–432
98. Shen Y, Ong SK, Nee AYC (2008) Collaborative design in 3D
space. In: ACM SIGGRAPH International Conference on vir-
tual-reality continuum and its applications in industry, ACM, p
29
99. Maher ML, Mi JK (2005) Do tangible user interfaces impact spa-
tial cognition in collaborative design?. In: International Confer-
ence on cooperative design, visualization, and engineering (vol
3675), Springer, pp 30–41
100. Kim MJ, Maher ML (2005) Comparison of designers using a
tangible user interface and a graphical user interface and the
impact on spatial cognition. In: International Workshop on
human behaviour in designing, pp 81–94
Engineering with Computers
101. Lee W, Park J (2006) Augmented foam: touchable and graspable
augmented reality for product design simulation. Bull Jpn Soc
Sci Des 52(6):17–26
102. Sidharta R, Oliver J, Sannier A (2006) Augmented reality tangi-
ble interface for distributed design review. In: computer graph-
ics, imaging and visualisation, 2006 International Conference on
IEEE, pp 464–470
103. Sakong K, Nam TJ (2006) Supporting telepresence by visual
and physical cues in distributed 3D collaborative design envi-
ronments. In: CHI’06 extended abstracts on human factors in
computing systems, ACM, pp 1283–1288
104. Wang X, Chen R (2009) An experimental study on collaborative
effectiveness of augmented reality potentials in urban design.
CoDesign 5(4):229–244
105. Wang X, Dunston PS (2006) Potential of augmented reality as
an assistant viewer for computer-aided drawing. J Comput Civil
Eng 20(6):437–441
106. Wang X, Dunston PS (2008) User perspectives on mixed real-
ity tabletop visualization for face-to-face collaborative design
review. Autom Constr 17(4):399–412
107. Wang X (2007) Exploring an innovative collaborative design
space through mixed reality boundaries. In: Computer supported
cooperative work in design, 2007. CSCWD 2007. 11th Interna-
tional Conference on IEEE, pp 264–269
108. Seichter H (2007) Augmented reality and tangible interfaces
in collaborative urban design. In: computer-aided architectural
design futures (CAADFutures) 2007, Springer, Dordrecht, pp
3–16
109. Santos P, Stork A, Gierlinger T, Pagani A, Paloc C, Barandarian
I, Jiménez JM (2007) Improve: an innovative application for col-
laborative mobile mixed reality design review. Int J Interactive
Des Manuf 1(2):115–126
110. Santos P, Stork A, Gierlinger T, Pagani A, Araújo B, Jota R,
Conti G (2007) Improve: collaborative design review in mobile
mixed reality. In: International Conference on Virtual Reality.
Springer, Berlin, pp 543–553
111. Chastine J, Nagel K, Zhu Y, Hudachek-Buswell M (2008) Stud-
ies on the effectiveness of virtual pointers in collaborative aug-
mented reality. In: 3D User Interfaces, 2008. 3DUI 2008. IEEE
Symposium on IEEE, pp 117–124
112. Lee JY, Rhee GW, Park H (2009) AR/RP-based tangible interac-
tions for collaborative design evaluation of digital products. Int J
Adv Manuf Technol 45(7–8):649–665
113. Hammad A, Wang H, Mudur SP (2009) Distributed augmented
reality for visualizing collaborative construction tasks. J Comput
Civil Eng 23(6):418–427
114. Dolinsky M (2010) Interactive augmented reality system
for product design review. IS&T/SPIE Electron Imaging
7525:75250H–75250H
115. Jota R, Araújo BRD, Bruno LC, Pereira JM, Jorge JA (2010)
Immiview: a multi-user solution for design review in real-time.
J Real Time Image Process 5(2):91–107
116. Januszka M, Moczulski W (2010) Augmented reality for machin-
ery systems design and development. New world situation: new
directions in concurrent engineering. Springer, London
117. Januszka M, Moczulski W (2007) Machinery designing aided
by augmented reality technology. Comput Assist Mech Eng Sci
14(14):621–630
118. Januszka M, Moczulski W (2006) Collaborative augmented real-
ity in CAD design. Mach Dyn Probl 30(3):124–131
119. Moczulski W, Panfil W, Januszka M, Mikulski G (2007) Applica-
tions of augmented reality in machinery design, maintenance and
diagnostics. Recent Advances in Mechatronics. Springer, Berlin,
Heidelberg, pp 52–56
120. Thomas BH, Itzstein GSV, Vernik R, Porter S, Marner MR,
Smith RT etal. (2011) Spatial augmented reality support for
design of complex physical environments. In: IEEE International
Conference on pervasive computing and communications work-
shops, IEEE, pp 588–593
121. Poppe E, Brown R, Johnson D, Recker J (2012) Preliminary
evaluation of an augmented reality collaborative process model-
ling system. In: International Conference on cyberworlds. IEEE
Computer Society, pp 77–84
122. Poppe E, Brown R, Johnson D, Recker J (2011) A prototype aug-
mented reality collaborative process modelling tool. In: Demo
Track of the Ninth Conference on Business Process Manage-
ment 2011, Clermont-Ferrand, France, August 2011
123. Ko CH, Chang TC (2012) Evaluation and student perception of
augmented reality-based design collaboration. Management
6(6):6
124. Gauglitz S, Nuernberger B, Turk M (2014) World-stabilized
annotations and virtual scene navigation for remote collabora-
tion. In: Proceedings of the 27th annual ACM symposium on
User interface software and technology. ACM, pp 449–459
125. Gauglitz S, Lee C, Turk M (2012) Integrating the physical envi-
ronment into mobile remote collaboration. In: International Con-
ference on human-computer interaction with mobile devices and
services. ACM, pp 241–250
126. Sodhi RS, Jones BR, Forsyth D, Bailey BP, Maciocci G (2013)
BeThere: 3D mobile collaboration with spatial input. In: SIGCHI
Conference on human factors in computing systems. ACM, pp
179–188
127. Gurevich P, Lanir J, Cohen B (2015) Design and implementa-
tion of teleadvisor: a projection-based augmented reality sys-
tem for remote collaboration. Comput Support Coop Work
24(6):527–562
128. Gurevich P, Lanir J, Cohen B, Ran S (2012) TeleAdvisor: a ver-
satile augmented reality tool for remote assistance. In: Sigchi
Conference on human factors in computing systems. ACM, pp
619–622
129. Gül LF, Halıcı M, Uzun C, Esengün M (2016) Understanding
the impact of mobile augmented reality on co-design cognition
and co-modelling. In: International Conference on cooperative
design, visualization and engineering. Springer International
Publishing, pp 362–370
130. Gül LF (2017) Studying gesture-based interaction on a mobile
augmented reality application for co-design activity. J Multi-
modal User Interfaces 12(2):109–124
131. Grandi JG (2017) Design of collaborative 3D user interfaces
for virtual and augmented reality. In: virtual reality (VR), 2017
IEEE. IEEE, pp 419–420
132. Grasset R, Lamb P, Billinghurst M (2005) Evaluation of mixed-
space collaboration. In: Proceedings of the 4th IEEE/ACM Inter-
national Symposium on Mixed and Augmented Reality. IEEE
Computer Society, pp 90–99
133. Ohshima T, Satoh K, Yamamoto H, Tamura H (1998) AR2
Hockey: a case study of collaborative augmented reality. In: Vir-
tual reality annual international symposium, 1998. Proceedings.,
IEEE 1998. IEEE, pp 268–275
134. Wang Y, Zhang S, Yang S, He W, Bai X, Zeng Y (2017) A
LINEMOD-based markerless tracking approach for AR applications.
Int J Adv Manuf Technol 89(5–8):1699–1707
135. Agrawala M, Beers AC, McDowall I, Fröhlich B, Bolas M, Han-
rahan P (1997) The two-user Responsive Workbench: support
for collaboration through individual views of a shared space. In:
Proceedings of the 24th annual conference on Computer graphics
and interactive techniques. ACM Press/Addison-Wesley Publish-
ing Co., pp 327–332
136. Yu D, Jin JS, Luo S, Lai W, Huang Q (2009) A useful visualiza-
tion technique: a literature review for augmented reality and its
application, limitation and future direction. Visual information
communication. Springer, Boston, MA, pp 311–337
137. Papagiannakis G, Singh G, Magnenat-Thalmann N (2008) A sur-
vey of mobile and wireless technologies for augmented reality
systems. Comput Anim Virtual Worlds 19(1):3–22
138. Gutwin C, Greenberg S (1998) Effects of awareness support on
groupware usability. In: Sigchi Conference on Human Factors in
Computing Systems (vol 98, pp 511–518). ACM Press/Addison-
Wesley Publishing Co
139. Szalavári Z, Gervautz M (1997) The personal interaction panel—
a two-handed interface for augmented reality. Comput Graph
Forum 16(3):C335–C346
140. Budhiraja R, Lee GA, Billinghurst M (2013) Interaction tech-
niques for HMD-HHD hybrid AR systems. IEEE International
Symposium on mixed and augmented reality (vol 8783, pp 243–
244). IEEE
141. Billinghurst M (2017) The coming age of empathic computing.
http://www.slideshare.net/marknb00/the-coming-age-of-empathic-computing
142. Hancock JT, Gee K, Ciaccio K, Lin MH (2008) I’m sad you’re
sad: emotional contagion in CMC. In: ACM Conference on com-
puter supported cooperative work. ACM, pp 295–298
143. Barsade SG (2002) The ripple effect: emotional contagion and
its influence on group behavior. Adm Sci Q 47(4):644–675
144. Cheshin A, Rafaeli A, Bos N (2011) Anger and happiness in
virtual teams: emotional influences of text and behavior on oth-
ers’ affect in the absence of non-verbal cues. Organ Behav Hum
Decis Process 116(1):2–16
145. Höllerer T, Feiner S, Terauchi T, Rashid G, Hallaway D
(1999) Exploring MARS: developing indoor and outdoor user
interfaces to a mobile augmented reality system. Comput
Graph 23(6):779–785
146. Billinghurst M, Thomas BH (2011) Mobile collaborative aug-
mented reality. In: recent trends of mobile collaborative aug-
mented reality systems. Springer, New York, pp 1–19
147. Jeon S, Hwang J, Kim GJ, Billinghurst M (2010) Interaction with
large ubiquitous displays using camera-equipped mobile phones.
Pers Ubiquitous Comput 14(2):83–94
148. Gül LF, Halici SM (2016) Collaborative design with mobile aug-
mented reality. In: 34th eCAADe Conference Proceedings. Oulu,
Finland, pp 493–500
149. Billinghurst M, Grasset R, Looser J (2005) Designing augmented
reality interfaces. ACM SIGGRAPH Comput Graph 39(1):17–22
150. Ishii H, Lakatos D, Bonanni L, Labrune JB (2012) Radical
atoms: beyond tangible bits, toward transformable materials.
Interactions 19(1):38–51
151. Purdy TG, Choi YM (2014) Enhancing augmented reality for
use in product design. In: CHI’14 Extended Abstracts on human
factors in computing systems. ACM, pp 1303–1308
152. Wang X (2008) Exploring augmented reality benefits in collab-
orative conceptualization. In: Computer supported cooperative
work in design, 2008. CSCWD 2008. 12th International Confer-
ence on IEEE, pp 699–704
153. Xing NL (2013) Augmented reality 3D design space. Doctoral
dissertation
154. Ng LX, Ong SK, Nee AYC (2015) Conceptual design using func-
tional 3D models in augmented reality. Int J Interact Des Manuf
9(2):115–133
155. Safin S, Delfosse V, Leclercq P (2010) Mixed-reality prototypes
to support early creative design. The Engineering of Mixed Real-
ity Systems. Springer, London, pp 419–445
156. Fuge M, Yumer ME, Orbay G, Kara LB (2012) Conceptual
design and modification of freeform surfaces using dual shape
representations in augmented reality environments. Comput
Aided Des 44(10):1020–1032
157. Turk M (2014) Multimodal interaction: a review. Pattern Recogn
Lett 36:189–195
158. Deng S, Jiang N, Chang J, Guo S, Zhang JJ (2017) Understanding
the impact of multimodal interaction using gaze informed mid-
air gesture control in 3D virtual objects manipulation. Int J Hum
Comput Stud 105:68–80
159. Quek F, McNeill D, Bryll R, Duncan S, Ma XF, Kirbas C, Ansari
R (2002) Multimodal human discourse: gesture and speech.
ACM Trans Comput Hum Interact 9(3):171–193
160. Shim J, Yang Y, Kang N, Seo J, Han TD (2016) Gesture-based
interactive augmented reality content authoring system using
HMD. Virtual Real 20(1):57–69
161. Henry P, Krainin M, Herbst E, Ren X, Fox D (2010) RGB-D
mapping: using depth cameras for dense 3D modeling of indoor
environments. In: 12th International Symposium on Experimental
Robotics (ISER)
162. Ismail AW, Billinghurst M, Sunar MS (2015) Vision-based tech-
nique and issues for multimodal interaction in augmented reality.
In: Proceedings of the 8th International Symposium on Visual
Information Communication and Interaction. ACM, pp 75–82
163. Lee JY, Park HM, Lee SH, Shin SH, Kim TE, Choi JS (2014)
Design and implementation of an augmented reality system using
gaze interaction. Multimed Tools Appl 68(2):265–280
164. Krafka K, Khosla A, Kellnhofer P, Kannan H, Bhandarkar S,
Matusik W, Torralba A (2016) Eye tracking for everyone. Pro-
ceedings of the IEEE conference on computer vision and pattern
recognition, pp 2176–2184
165. Chatterjee I, Xiao R, Harrison C (2015) Gaze + gesture: expres-
sive, precise and targeted free-space interactions. In: Proceedings
of the 2015 ACM on International Conference on Multimodal
Interaction. ACM, pp 131–138
166. Song J, Cho S, Baek SY, Lee K, Bang H (2014) GaFinC: gaze
and Finger Control interface for 3D model manipulation in CAD
application. Comput Aided Des 46:239–245
167. Wu D, Rosen DW, Schaefer D (2014) Cloud-based design and
manufacturing: status and promise. Cloud-Based Design and
Manufacturing (CBDM). Springer, Cham, pp 1–24
168. Wang X, Dunston PS (2013) Tangible mixed reality for remote
design review: a study understanding user perception and accept-
ance. Vis Eng 1(1):1–15
169. Huang JM, Ong SK, Nee AYC (2016) Visualization and inter-
action of finite element analysis in augmented reality. Comput
Aided Des 84:1–14
170. Huang JM, Ong SK, Nee AYC (2015) Real-time finite ele-
ment structural analysis in augmented reality. Adv Eng Softw
87(2):43–56
171. Burdea GC (1999) Invited review: the synergy between virtual
reality and robotics. IEEE Trans Robot Autom 15(3):400–410
172. Seth A, Vance JM, Oliver JH (2011) Virtual reality for assembly
methods prototyping: a review. Virtual Real 15(1):5–20
173. Bordegoni M, Ferrise F (2013) Designing interaction with con-
sumer products in a multisensory virtual reality environment.
Virtual Phys Prototyp 8(1):51–64
174. Wang P, Zhang S, Bai X, Billinghurst M, He W, Sun M, Chen
Y, Lv H, Ji H (2019) 2.5DHANDS: a gesture-based MR remote
collaborative platform. Int J Adv Manuf Technol 2019:1–15
175. Wang P, Zhang S, Bai X, Billinghurst M, He W, Zhang L, Du
J, Wang S (2018) Do you know what i mean? an mr-based col-
laborative platform. In: IEEE International Symposium on Mixed
and Augmented Reality (ISMAR), 2018
Publisher’s Note Springer Nature remains neutral with regard to
jurisdictional claims in published maps and institutional affiliations.
... Speci cally, these factors distinctly promoted the popularity of remote collaborative activities, as is the case of training, emergency response, and medical and industrial domains [7,8]. Although video-conferencing has prospered, Augmented Reality (AR) technology provides lots of opportunities lie ahead for more advanced strategies for remote collaboration platforms in training/ procedural tasks and remote assistance [1,3,[9][10][11][12]. For successful AR-based remote collaboration, it is key for partners to timely proper instructions [13], and it can improve user experience and provide a platform that enables two or more partners geographically separated from one another to feel like they are virtually co-located work [13][14][15]. ...
... According to survey papers [1,11,12,30,31] closely related to AR remote collaboration, the SAR interface is one of the most popular conditions(e.g., desktop-based, handheld, head-mounted displays(HMD), SARbased interface) used on the working site and the on-site workers, signi cantly, prefer the SAR-based interface because of freeing up hands and no need to wear a helmet. As con rmed by the literature, there is much research on SAR-based remote collaboration in training, repair, assembly, etc. ...
... AAS-based cues are widely used in remote collaboration on assembly and training because it's easily implemented and freely and naturally express the key visual communication information [1,11,12,30,38]. There has been a large number of scholars have been putting a premium on how remote experts naturally and intuitively express AAS and improve remote collaboration platforms using AAS cues. ...
Preprint
Full-text available
Global events such as pandemics and wars have prompted many individuals to reassess the significance of remote collaboration for training, providing assistance, maintenance, and education. While video conferencing has gained popularity, numerous opportunities still exist for more sophisticated methods of Augmented Reality (AR) remote interaction. Hand-drawn AR sketches/annotations (AAS) are commonly used for expressing visual instructions. However, these freehand drawings are not enough to communicate the specific instructions required for industrial applications. Therefore, oral communication always serves as a critical supplement for addressing misinterpretations and language barriers. In such cases, our work is dedicated to sharing clear instructions based on AAS by the adaptive transformation of instructions (ATI) method. As a result, we present a Spatial AR(SAR) remote collaborative platform that can support converting AAS into standard symbols which provides clear guidance and has been widely accepted by the public. We conduct a formal user study to evaluate the prototype platform concerning performance time, general collaborative experience, usability based on ranking, and users’ preferences. The results indicated that ATI-based cues have a positive rule on remote collaborative training tasks in terms of user experience. More significantly, our work provides valuable implications on the way for further study of gesture-based interaction in AR remote collaboration on training tasks.
... Only by better understanding these topics can the research community mature the current state of Collaborative XR and boost a larger adoption of solutions for remote scenarios worldwide. This lack of systematization in ethics, privacy, and security is evident when analyzing surveys devoted to Collaborative XR, where a clear absence of these topics stands out when compared to other dimensions of the collaborative effort (e.g., team, time, task, and others) [4,6,20,23,31,35,36]. ...
... eye-tracking, gait, height, as well as emotional and physical reactions, among others), and even multi-factor authentication to ensure proper access is given to team members, while also reducing the possibility of attacks. Besides, authentication can also restrict access controls, limiting what team members are able to do within the collaborative process according to their role, so that they only have access to what it is intended to at any time [3,5,8,10,36]. ...
Conference Paper
Scenarios of remote collaboration using eXtended Reality (XR) have seen significant growth in recent years, allowing distributed team members to overlay digital information into the physical world, mak- ing it an ideal platform for certain activities. As more organizations look for XR to maintain productivity and efficiency while collaborating remotely, aspects like ethics, privacy, and security, which have not been explored in detail, become paramount. In this paper, the state of these topics is described and discussed. This is impor- tant given that the success of XR-remote collaboration depends on collaborators’ trust in the technology, which must be developed and used in a way that protects user privacy and security while complying with regulations and mitigating risks. Therefore, ethics, privacy, and security play an important role in building and maintaining confidence. With this, we hope to mobilize the community to start questioning their works in search for how these matters impact and should inform their research.
... d VR technology has become widespread in vocational training and business applications. The utilization of remote technical guidance with a digital twin has been widely implemented globally, with notable applications in product design, vocational training, maintenance, assembly, and other manufacturing operations (Buń et al., 2021;Tan et al., 2021;P. Wang et al., 2020P. Wang et al., , 2021. For instance, these technologies have been effectively employed in professional skills development, including remote education and training, remote machine repair training, and ...
Article
Full-text available
During the COVID-19 pandemic, distance teaching became the main solution, including for the furniture woodworking course at National Taipei University of Technology in Taipei, Taiwan, which relied on video and online software. However, this posed challenges for maintaining teaching quality and achieving objectives in technical practice courses. To address this, this study introduced remote technical guidance using VR in technical practice courses. This method combined distance-teaching and live dual-teacher broadcasts, allowing students to participate in real-time online discussions. During these broadcasts, instructors used VR to demonstrate operations on a virtual platform, explaining as they went along. Students could observe from the operator's perspective, gaining insights into furniture production processes. They also engaged in group interactions, assuming roles like technical operators, thereby mastering key furniture production concepts. This innovative teaching approach offered a solution that combined remote technical guidance with VR. It provided immediate teaching enhancements and problem-solving solutions in the post-COVID-19 era.
... The technologies used in telepresence are closely interleaved with VR, AR and MR. In this context, AR and MR can be used to support collaborative work [100], with applications in telemedicine, teleducation [101][102][103] and codesign in manufacturing [104]. Concepts like local and remote users are also clearly present in remote collaboration, with social and copresence factors affecting the user experience [105]. ...
Article
Full-text available
Three decades ago, telepresence was presented as an idea in the context of remote work and manipulation. Since then, it has evolved into a field combining different technologies and allowing users to have more or less realistic perceptions of immersion in remote environments. This paper reviews telepresence and its recent advances. While not covering all the work conducted in telepresence, this paper provides an array of applications for which telepresence can be envisioned, providing a clear view of the differences between components and functionalities of robotic platforms conceived for telepresence and pointing to the dependence of telepresence on several technological areas. Furthermore, challenges faced by telepresence technologies are shown, with consideration of user experiences. We consider telepresence from different perspectives, focusing on specific parts, making it possible to foresee future directions of research and applications. This review will be useful for researchers working in telepresence and related fields.
Article
This research offers a pragmatic view on the adoption of Augmented Reality (AR), Virtual Reality (VR), and Mixed Reality (MR) in designing the built environment. Participants from 20 U.S. states and beyond formed a non-probability sample representing small to mid-sized Architecture, Engineering, and Construction (AEC) firms. The author engaged 59 professional participants through a 26-question online questionnaire, informed by existing literature and reviewed by two industry experts. Three additional expert participants provided comprehensive insights via semi-structured interviews. Results highlight design visualization and client presentations as top AR, VR, and MR applications. Key benefits include improved design assessment, early error detection, and heightened client satisfaction. Design collaboration was less prominent than suggested by the literature. Notable challenges persist in first-time user adoption and cost factors of equipment and training. Thus, the cost-benefit balance drives the dominance of older, lower-end devices found in this study despite the availability of advanced, high-fidelity infrastructure.
Conference Paper
Recent research on remote collaboration focuses on improving the sense of co-presence and mutual understanding among the collaborators, whereas there is limited research on using non-verbal cues such as gaze or head direction alongside their main communication channel. Our system – GlassBoARd – permits collaborators to see each other’s gaze behavior and even make eye contact while communicating verbally and in writing. GlassBoARd features a transparent shared Augmented Reality interface that is situated in-between two users, allowing face-to-face collaboration. From the perspective of each user, the remote collaborator is represented as an avatar that is located behind the GlassBoARd and whose eye movements are contingent on the remote collaborator’s instant eye movements. In three iterations, we improved the design of GlassBoARd and tested it with two use cases. Our preliminary evaluations showed that GlassBoARd facilitates an environment for conducting future user experiments to study the effect of sharing eye gaze on the communication bandwidth.
Chapter
The need for remote collaborative work is constantly increasing. Collaboratively adapting digital content, such as documents and images, has come to a stage where it is part of our daily lives. In comparison, remote collaboration on physical objects has matured at a slower pace, even though this is a possible step towards location-independent cooperation and therefore equality in work. In this paper, we present a structured literature review on computer-supported remote collaboration on physical objects from the last 23 years. Our contribution is threefold: First, we provide a comprehensive analysis of the current state of research on the topic of remote collaboration on physical objects. Second, we identify multiple research gaps, such as inclusion of haptic sense, mutual collaboration, and asynchronous collaboration. Third, we analyze code relationships in the selected publications and provide directions for future work in the form of exploratory research questions. KeywordsLiterature ReviewRemote CollaborationPhysical Objects
Article
Full-text available
With the rapid development of augmented reality (AR) technology and devices, it is widely used in education, design, industry, game, medicine and other fields. It brings new development opportunities for computer-supported cooperative work. In recent years, there has been an increasing number of studies on AR collaboration. Many professional researchers have also summarized and commented on these local and remote applications. However, to the best of our knowledge, there is no comprehensive review specifically on AR-enabled local collaboration (AR-LoCol). Therefore, this paper presents a comprehensive survey of research between 2012 and 2022 in this domain. We surveyed 133 papers on AR-LoCol in Web of Science, 75% of which were published between 2018 and 2022. Next, we provide an in-depth review of papers in seven areas, including time (synchronous and asynchronous), device (hand-held display, desktop, spatial AR, head-mounted display), participants (double and multiple), place (standing, indoor and outdoor), content (virtual objects, annotations, awareness cues and multi-perspective views), and area (education, industry, medicine, architecture, exhibition, game, exterior design, visualization, interaction, basic tools). We discuss the characteristics and specific work in each category, especially the advantages and disadvantages of different devices and the necessity for shared contents. Following this, we summarize the current state of development of AR-LoCol and discuss possible future research directions. This work will be useful for current and future researchers interested in AR-LoCol systems.
Article
Full-text available
Current remote collaborative systems in manufacturing are mainly based on video-conferencing technology. Their primary aim is to transmit manufacturing process knowledge between remote experts and local workers. However, it does not provide the experts with the same hands-on experience as when synergistically working on site in person. The mixed reality (MR) and increasing networking performances have the capacity to enhance the experience and communication between collaborators in geographically distributed locations. In this paper, therefore, we propose a new gesture-based remote collaborative platform using MR technology that enables a remote expert to collaborate with local workers on physical tasks. Besides, we concentrate on collaborative remote assembly as an illustrative use case. The key advantage compared to other remote collaborative MR interfaces is that it projects the remote expert’s gestures into the real worksite to improve the performance, co-presence awareness, and user collaboration experience. We aim to study the effects of sharing the remote expert’s gestures in remote collaboration using a projector-based MR system in manufacturing. Furthermore, we show the capabilities of our framework on a prototype consisting of a VR HMD, Leap Motion, and a projector. The prototype system was evaluated with a pilot study comparing with the POINTER (adding AR annotations on the task space view through the mouse), which is the most popular method used to augment remote collaboration at present. The assessment adopts the following aspects: the performance, user’s satisfaction, and the user-perceived collaboration quality in terms of the interaction and cooperation. Our results demonstrate a clear difference between the POINTER and 2.5DHANDS interface in the performance time. 
Additionally, the 2.5DHANDS interface was statistically significantly higher than the POINTER interface in terms of the awareness of user’s attention, manipulation, self-confidence, and co-presence.
Conference Paper
Full-text available
Virtual Reality enables users to explore content whose physics are only limited by our creativity. Such limitless environments provide us with many opportunities to explore innovative ways to support productivity and collaboration. We present Spacetime, a scene editing tool built from the ground up to explore the novel interaction techniques that empower single user interaction while maintaining fluid multi-user collaboration in immersive virtual environment. We achieve this by introducing three novel interaction concepts: the Container, a new interaction primitive that supports a rich set of object manipulation and environmental navigation techniques, Parallel Objects, which enables parallel manipulation of objects to resolve interaction conflicts and support design workflows, and Avatar Objects, which supports interaction among multiple users while maintaining an individual users' agency. Evaluated by professional Virtual Reality designers, Spacetime supports powerful individual and fluid collaborative workflows.
Conference Paper
Full-text available
A video tutorial effectively conveys complex motions, but may be hard to follow precisely because of its restriction to a predetermined viewpoint. Augmented reality (AR) tutorials have been demonstrated to be more effective. We bring the advantages of both together by interactively retargeting conventional, two-dimensional videos into three-dimensional AR tutorials. Unlike previous work, we do not simply overlay video, but synthesize 3D-registered motion from the video. Since the information in the resulting AR tutorial is registered to 3D objects, the user can freely change the viewpoint without degrading the experience. This approach applies to many styles of video tutorials. In this work, we concentrate on a class of tutorials which alter the surface of an object.
Article
Full-text available
Multimodal interactions provide users with more natural ways to manipulate virtual 3D objects than traditional input methods. An emerging approach is gaze modulated pointing, which enables users to perform object selection and manipulation in a virtual space conveniently through a combination of gaze and other interaction techniques (e.g., mid-air gestures). As gaze modulated pointing uses different sensors to track and detect user behaviours, its performance relies on the user's perception of the exact spatial mapping between the virtual space and the physical space. An underexplored issue is that, when the spatial mapping differs from the user's perception, manipulation errors (e.g., out-of-boundary errors, proximity errors) may occur. Therefore, in gaze modulated pointing, as gaze can introduce misalignment of the spatial mapping, it may lead to the user's misperception of the virtual environment and consequently to manipulation errors. This paper provides a clear definition of the problem through a thorough investigation of its causes and specifies the conditions under which it occurs, which is further validated in the experiment. It also proposes three methods (Scaling, Magnet and Dual-gaze) to address the problem and examines them in a comparative study involving 20 participants with 1,040 runs. The results show that all three methods improved manipulation performance with regard to the defined problem, with Magnet and Dual-gaze delivering better performance than Scaling. This finding could be used to inform a more robust multimodal interface design supported by both eye tracking and mid-air gesture control without losing efficiency and stability.
Conference Paper
This paper describes a case study of building a prototype of an immersive three-dimensional (3-D) modeler which supports simple two-handed operations. Designing 3-D objects in a virtual environment has a number of advantages for 3-D geometry creation over designing with traditional computer-aided design (CAD) tools. In order to enhance human-computer interaction in a virtual workspace, two-handed spatial input has been incorporated into a few 3-D designing applications. However, existing 3-D designing tools do not sufficiently utilize two-handed interaction to enhance the interface. Our prototype immersive modeler, VLEGO, employs some features of toy blocks to provide flexible two-handed interaction for 3-D design. The features of VLEGO can be summarized as follows: Firstly, VLEGO supports various two-handed operations and hence makes the design environment intuitive and efficient. Secondly, possible locations and orientations of primitives are discretely limited so that the user can arrange objects accurately with ease. Finally, the system automatically avoids collisions among primitives and adjusts their positions. As a result, precise design of 3-D objects can be achieved easily by using a set of two-handed operations in an intuitive way. This paper describes the design and implementation of VLEGO as well as an experiment examining the effectiveness of two-handed interaction.
Article
Multi-touch user interfaces (MTUIs) of mobile devices can represent a valuable tool for enhancing co-design practice. In particular, mobile augmented reality (AR) technology enhanced with MTUIs can offer designers new possibilities for working and collaborating. This paper describes a series of user studies that took place in the Department of Architecture at the Istanbul Technical University in January 2016. The users tested the developed AR application. The task of the participants was to design mass volumes of buildings in the given contexts. In the first study, the designer worked solo using analogue tools and two different MTUIs of the AR applications. In the second study, the designers collaborated using analogue tools and an enhanced mobile augmented reality environment. The goal of the study is to understand the affordances of the interfaces and the changes in designers' behaviour when they are using the mobile AR applications. The particular focus of the paper is on the characterization of the co-design cognition and interaction behaviour of the designers. The data collected from the empirical studies are analysed with the protocol analysis method using a coding scheme. The results show that the studied AR interfaces afford different modelling actions and design behaviour. The design context was broadly conceptual and high-level; the intensity of the actions also differed across the AR sessions. We consider that this knowledge will be informative for the development of innovative augmented reality applications supporting design activity and will provide guidance for further developments.
Conference Paper
We present Co-3Deator, a sketch-based collaborative 3D modeling system based on the notion of "team-first" ideation tools, where the needs and processes of the entire design team come before those of an individual designer. Co-3Deator includes two specific team-first features: a concept component hierarchy, which provides a design representation suitable for multi-level sharing and reusing of design information, and a collaborative design explorer for storing, viewing, and accessing hierarchical design data during collaborative design activities. We conduct two controlled user studies, one with individual designers to elicit the form and functionality of the collaborative design explorer, and the other with design teams to evaluate the utility of the concept component hierarchy and design explorer for collaborative design ideation. Our results support our rationale for both of the proposed team-first collaboration mechanisms and suggest further ways to streamline collaborative design.