Figure 4 - available via license: CC BY
Example screenshot of the event management page.

Source publication
Article
Full-text available
This paper proposes a set of director tools for autonomous media production with a team of drones. There is a clear trend toward using drones for media production, and the director is the person in charge of the whole system from a production perspective. Many applications, mainly outdoors, can benefit from the use of multiple drones to achieve multi-...

Context in source publication

Context 1
... event management page shows both information on already created events, such as their planned date and time, and available actions to be performed on single events (i.e., edit and delete). An example is shown in Figure 4. If the director or the editorial staff need to create new events, they can select the "Plan new event" item on the left menu of the director home page. ...

Similar publications

Chapter
Full-text available
The development of a navigation system for the landing of a swarm of drones on a movable surface is one of the major challenges in building a fully autonomous platform. Hence, the purpose of this study is to investigate the behaviour of a swarm of ten drones under the mission of soft landing on a movable surface that has a linear speed with the eff...

Citations

... Further investigations, such as those by Montes-Romero et al. [3] and Walmsley and Kersten [4], delve into VPX's broader implications, demonstrating its role in augmenting visual effects, facilitating real-time decision-making, and promoting collaborative production dynamics. These studies emphasize VPX's instrumental role in reshaping filmmaking, fostering a more interconnected and efficient creative environment. ...
Article
Full-text available
This work aims to identify and propose a functional pipeline for indie live-action films using Virtual Production with photorealistic real-time rendering game engines. The new production landscape is radically changing how movies and shows are made. Productions used to follow a linear pipeline; now filmmakers can execute multiple tasks in parallel using real-time renderers with high potential for different types of productions. Four interviews with professionals in the Spanish film and television market were conducted to obtain a full perspective on the new paradigm. Following those examples, a virtual production set was implemented with an Antilatency tracking system, Unreal Engine (version 5.3), and Aximmetry (version 2023.3.2) as the leading software applications. Results are discussed, showing how pre-production, shooting, and post-production are now closely connected, and analyzing the approach's potential in different fields.
... In recent years, unmanned aerial vehicles (UAVs) have been widely used in agriculture [1], environmental monitoring [2], construction [3], logistics and transportation [4], disaster response [5], scientific exploration [6], entertainment and media [7], and many other fields. Particularly noteworthy is UAV path planning, an important research direction with a direct impact on the practical effectiveness and performance of UAVs across these fields. ...
Article
Full-text available
Unmanned aerial vehicle (UAV) path planning plays an important role in UAV flight, and an effective algorithm is needed to realize it. The sand cat algorithm is characterized by simple parameter settings and easy implementation; however, its convergence is slow and it easily falls into local optima. To solve these problems, a novel sand cat algorithm incorporating learning behaviors (LSCSO) is proposed. LSCSO is inspired by the life habits and learning ability of sand cats and incorporates a new position-update strategy into the basic Sand Cat Optimization Algorithm, which maintains population diversity and improves convergence during optimization. Finally, LSCSO is applied to the challenging problem of 3D UAV path planning, with cubic B-spline interpolation used to generate a smooth path, and the proposed algorithm is compared with a variety of competing algorithms. The experimental results show that LSCSO has excellent optimization ability and plans a safe, feasible path with minimal cost among all the compared algorithms.
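The B-spline smoothing step mentioned in the abstract can be illustrated with a minimal sketch: given discrete 3D waypoints from a planner, a uniform cubic B-spline over the control polygon yields a smooth flight path. The waypoint data and function names here are illustrative, not the paper's implementation.

```python
def cubic_bspline_point(p0, p1, p2, p3, t):
    """Evaluate one uniform cubic B-spline segment at t in [0, 1]."""
    b0 = (1 - t) ** 3 / 6.0
    b1 = (3 * t**3 - 6 * t**2 + 4) / 6.0
    b2 = (-3 * t**3 + 3 * t**2 + 3 * t + 1) / 6.0
    b3 = t**3 / 6.0
    return tuple(b0 * a + b1 * b + b2 * c + b3 * d
                 for a, b, c, d in zip(p0, p1, p2, p3))

def smooth_path(waypoints, samples_per_segment=10):
    """Sample a smooth 3D path from a list of control waypoints."""
    path = []
    for i in range(len(waypoints) - 3):  # one segment per 4 consecutive points
        for s in range(samples_per_segment):
            t = s / samples_per_segment
            path.append(cubic_bspline_point(*waypoints[i:i + 4], t))
    return path

# Illustrative planner output: (x, y, z) waypoints in meters.
waypoints = [(0, 0, 10), (5, 2, 12), (10, 0, 12), (15, 3, 11), (20, 0, 10)]
smooth = smooth_path(waypoints)
```

Because the basis functions form a partition of unity, the sampled path stays within the convex hull of the waypoints, which is what gives the smoothed trajectory its safety margin relative to the planned corridor.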
... However, its mission snippets offer good readability, and its integration into existing platforms seems lightweight. 7) "Director Tools": Montes-Romero et al. [31], [32] propose an approach to facilitate the capture of cinematographic shots and scenes with multiple drones. They also note that approaches using multiple drones already exist, but that the expertise required to apply them exceeds that of a pure domain user. ...
Article
Full-text available
Existing software tools for specifying and executing multidrone missions are limited to route planning or tightly coupled to specific drone hardware. We introduce EAMOS (Execution of Aerial Multidrone Missions and Operations Specification Framework), which allows missions to be specified intuitively in text form and provides a mission compiler, a mission middle layer, and a distributed drone execution environment. The middle layer wraps the control of individual drone-specific capabilities, such as launch, fly to position, or perform a maneuver, into a public API that transparently utilizes the capabilities of numerous drone platforms. We exploit the Go programming language to implement critical components of the framework and provide an interface for ROS-based drone platforms. EAMOS automates the mission execution on real, virtual, and even hybrid robotic setups involving real and virtual drones. We demonstrate the successful deployment of EAMOS with four missions executed on Pixhawk/PX4-equipped quadcopters and virtual drones simulated with Airsim. We assess the performance of our proposed approach by analyzing the number of nodes and arcs of the mission graphs, which are an essential artifact of our mission compilation, the utilization of ROS service calls during mission execution, and the duration of compilation, deployment, and mission execution. Overall, our experiments showed that our drones correctly behaved during mission execution as expected and specified by their mission, the generated mission artifacts were efficiently manageable, and processing times allowed for a fluent workflow.
... The UAS is one kind of director tool for autonomous media production. From a production perspective, it offers bird's-eye views, multi-view coverage, and concurrent shots, which differ from simple single-camera production (Romero et al., 2020). ...
Article
Full-text available
The defense industry has had a major impact on the global economy. The 11th S-curve is a target industry promoted under the Eastern Economic Corridor (EEC) in Thailand. The state of the art in defense technology is advancing continuously, including Unmanned Aircraft Systems (UAS), which can be used for media production in the defense domain. UAS pilots for media production in defense technology differ from other UAS pilots: they require both hard and soft skills, such as management, decision-making, planning, and knowledge of control and cinematography, including the ability to choose and operate the right equipment for filming aerial movies properly and the creative talent to film them. UAS knowledge and control can be learned in a remote pilot license course, but the soft skills can only partly be developed through experience. The purpose of this study is to lay out guidelines for using UAS media production in defense technology, with the expectation of providing specific views across the multiple domains of this research area. It is a combination of engineering, science, art, and management. The content of this article is based on experience from UAS operation.
... In the context of the proposed architecture, this would facilitate the interaction between the Director and the envisioned autonomous system. Thus, an attempt at building such a language for the case of full live event shooting was carried out and described in [28,35]. ...
... Each shot or Shooting Action contains specific parameters like the shot type (e.g., CMT, FST, etc.), the duration, the starting position, and so on; and it is associated with a triggering Event (e.g., the start/end of a race, targets reaching a point of interest, etc.). Shooting Missions are translated into an XML-based language [35] that can be interpreted by the autonomous multi-UAV system. Then, the system is able to compute feasible plans to assign Shooting Actions to the different UAVs and execute them. ...
... Further details concerning the Dashboard can be found in [28,35]. ...
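The context above describes Shooting Actions with typed parameters and triggering Events, serialized into an XML-based language. A minimal sketch of such a serialization, using only the Python standard library, is given below; all element and attribute names are hypothetical and do not reproduce the actual schema from [35].

```python
# Hypothetical serialization of a Shooting Mission with one event-triggered
# Shooting Action. Element and attribute names are illustrative only.
import xml.etree.ElementTree as ET

mission = ET.Element("ShootingMission", name="rowing_race")
action = ET.SubElement(mission, "ShootingAction",
                       shot_type="FST",            # hypothetical shot-type code
                       duration_s="30",
                       trigger_event="START_RACE")  # Event that starts the shot
start = ET.SubElement(action, "StartPosition")
start.set("lat", "37.3891")
start.set("lon", "-5.9845")
start.set("alt_m", "15")

xml_text = ET.tostring(mission, encoding="unicode")
```

A planner consuming such a document would parse each `ShootingAction`, look up its `trigger_event`, and schedule the shot on an available UAV; the point of the intermediate language is that the director never writes this XML by hand.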
Article
Full-text available
Cinematography with Unmanned Aerial Vehicles (UAVs) is an emerging technology promising to revolutionize media production. On the one hand, manually controlled drones already provide advantages, such as flexible shot setup, opportunities for novel shot types and access to difficult-to-reach spaces and/or viewpoints. Moreover, little additional ground infrastructure is required. On the other hand, enhanced UAV cognitive autonomy would allow both easier cinematography planning (from the Director’s perspective) and safer execution of that plan during actual filming; while integrating multiple UAVs can additionally augment the cinematic potential. In this paper, a novel multiple-UAV software/hardware architecture for media production in outdoor settings is proposed. The architecture encompasses mission planning and control under safety constraints, enhanced cognitive autonomy through visual analysis, human-computer interfaces and communication infrastructure for platform scalability with Quality-of-Service provisions. Finally, the architecture is demonstrated via a relevant subjective study on the adequacy of UAV and camera parameters for different cinematography shot types, as well as with field experiments where multiple UAVs film outdoor sports events.
... However, emergency scenarios are not the only ones in which these systems can add value. Ángel Montes-Romero, Arturo Torres-González, Jesús Capitán, Maurizio Montagnuolo, Sabino Metta, Fulvio Negro, Alberto Messina, and Aníbal Ollero [15] propose a set of director tools for autonomous media production with a team of drones. They focus on a language for cinematography mission description and a procedure to translate missions into plans, so a media director that is not necessarily familiar with robots can manage the system. ...
Article
Full-text available
Multi-Robot Systems (MRSs) have emerged as a suitable alternative to single robots to improve current and enable new missions [...]
... This work has been developed within the framework of the EU-funded project MultiDrone 1 , whose objective was to create a complete system for autonomous cinematography with multiple UAVs in outdoor sport events (see Figure 1). MultiDrone addressed different aspects to build a complete architecture: a set of high-level tools so that the cinematography director can define shots for the mission [6]; planning methods to assign and schedule the shots among the UAVs efficiently and considering battery constraints [7]; vision-based algorithms for target tracking on the camera image [8], etc. In this paper, we focus on the autonomous execution of shots with a multi-UAV team. ...
... We assume that there is a cinematography director who is in charge of describing the desired shots from a high-level perspective. We created a graphical tool and a novel cinematography language [6] to support the director through this task. Once the mission is specified, the system has planning components [7] that compute feasible plans for the mission, assigning shots to the available UAVs according to shot duration and remaining UAV flight time. ...
Preprint
This paper presents a method for planning optimal trajectories with a team of Unmanned Aerial Vehicles (UAVs) performing autonomous cinematography. The method is able to plan trajectories online and in a distributed manner, providing coordination between the UAVs. We propose a novel non-linear formulation for this challenging problem of computing multi-UAV optimal trajectories for cinematography; integrating UAVs dynamics and collision avoidance constraints, together with cinematographic aspects like smoothness, gimbal mechanical limits and mutual camera visibility. We integrate our method within a hardware and software architecture for UAV cinematography that was previously developed within the framework of the MultiDrone project; and demonstrate its use with different types of shots filming a moving target outdoors. We provide extensive experimental results both in simulation and field experiments. We analyze the performance of the method and prove that it is able to compute online smooth trajectories, reducing jerky movements and complying with cinematography constraints.
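Among the cinematographic aspects this abstract lists is smoothness, i.e., avoiding jerky camera motion. A common way to quantify this, sketched below, is a discrete jerk cost: the sum of squared third finite differences of position over a sampled trajectory. The trajectory data and the cost form are illustrative, not the paper's exact formulation.

```python
def jerk_cost(positions, dt):
    """Sum of squared third finite differences (per axis) over the path."""
    cost = 0.0
    for i in range(len(positions) - 3):
        for ax in range(len(positions[0])):
            # Third-order finite difference approximates d^3x/dt^3.
            d3 = (positions[i + 3][ax] - 3 * positions[i + 2][ax]
                  + 3 * positions[i + 1][ax] - positions[i][ax]) / dt**3
            cost += d3 * d3
    return cost

# A straight constant-velocity path has zero jerk; a kinked path does not.
straight = [(t, 0.0, 10.0) for t in range(8)]
kinked = [(t, 0.0 if t < 4 else float(t - 3), 10.0) for t in range(8)]
```

A trajectory optimizer would include a term like this in its objective, trading it off against collision-avoidance and visibility constraints, so that the solver prefers paths with gradual velocity changes.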
... In MultiDrone project, we proposed a new taxonomy for cinematographic shots with drones [3], [4], and with the support of experts from the media production companies involved in the project, we selected a set of representative shots to be implemented autonomously by the system. These shots can be defined by the media director through a high-level graphical interface with a novel language that we created for cinematography mission description [5]. The director indicates desired shot types, starting times/positions and durations; but she/he does not assign specific drone cinematographers to them. ...
... For instance, a director could design a mission to film a rowing race; and specify a lateral shot from the START_RACE Event to the end of the race, and an orbital shot starting with the FINISH_LINE Event, i.e., when the boats reach the finish line. We proposed a novel cinematography language [5] so that the director's input is written with a specific syntax that is later understandable for our planning components. ...
Preprint
This paper presents a system for the execution of autonomous cinematography missions with a team of drones. The system allows media directors to design missions involving different types of shots with one or multiple cameras, running sequentially or concurrently. We introduce the complete architecture, which includes components for mission design, planning and execution. Then, we focus on the components related to autonomous mission execution. First, we propose a novel parametric description for shots, considering different types of camera motion and tracked targets; and we use it to implement a set of canonical shots. Second, for multi-drone shot execution, we propose distributed schedulers that activate different shot controllers on board the drones. Moreover, an event-based mechanism is used to synchronize shot execution among the drones and to account for inaccuracies during shot planning. Finally, we showcase the system with field experiments filming sport activities, including a real regatta event. We report on system integration and lessons learnt during our experimental campaigns.
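The event-based synchronization idea in this abstract (distributed schedulers that activate shot controllers when a shared event fires) can be sketched as follows. Class names, event names, and the in-process dispatcher are illustrative assumptions; the real system runs distributed on board the drones.

```python
class ShotScheduler:
    """Per-drone scheduler: holds shots pending on a triggering event."""
    def __init__(self, drone_id):
        self.drone_id = drone_id
        self.pending = {}   # event name -> shot description
        self.active = []    # shots whose controllers are running

    def schedule(self, event, shot):
        self.pending[event] = shot

    def on_event(self, event):
        shot = self.pending.pop(event, None)
        if shot is not None:
            self.active.append(shot)  # activate the shot controller

class EventDispatcher:
    """Broadcasts an event to every registered drone scheduler at once."""
    def __init__(self):
        self.schedulers = []

    def register(self, scheduler):
        self.schedulers.append(scheduler)

    def fire(self, event):
        for s in self.schedulers:  # all drones react to the same event
            s.on_event(event)

dispatcher = EventDispatcher()
d1, d2 = ShotScheduler("drone_1"), ShotScheduler("drone_2")
dispatcher.register(d1)
dispatcher.register(d2)
d1.schedule("START_RACE", "lateral")
d2.schedule("FINISH_LINE", "orbital")
dispatcher.fire("START_RACE")
```

Broadcasting the event rather than per-drone timers is what lets shots start in sync even when planning-time estimates of the event (e.g., when boats actually cross the start line) are inaccurate.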
... The work lies within the framework of the EU-funded project MULTIDRONE 3 , which has developed autonomous media production with small teams of UAVs. In previous work [1], we proposed a graphical interface and a novel language so that the media production Director could design shooting missions. Thus, the Director specifies the characteristics of multiple shots that should be executed to film a given event. ...
... The workflow of the system starts with the Director describing the shooting mission through the Dashboard, which is an intuitive GUI. We described the Dashboard and the process to transform shooting missions into filming tasks in previous work [1], where we proposed a novel language for the description of media missions. The Mission Controller (MC) receives the mission with the shooting actions and requests the HLP for a plan. ...
Preprint
This paper proposes a planning algorithm for autonomous media production with multiple Unmanned Aerial Vehicles (UAVs) in outdoor events. Given filming tasks specified by a media Director, we formulate an optimization problem to maximize the filming time considering battery constraints. As we conjecture that the problem is NP-hard, we consider a discretized version, and propose a graph-based algorithm that can find an optimal solution of the discrete problem for a single UAV in polynomial time. Then, a greedy strategy is applied to solve the problem sequentially for multiple UAVs. We demonstrate that our algorithm is efficient for small teams (3-5 UAVs) and that its performance is close to the optimum. We showcase our system in field experiments carrying out actual media production in an outdoor scenario with multiple UAVs.
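The sequential greedy strategy for multiple UAVs described above can be sketched as follows: solve for one UAV at a time over the still-unassigned tasks. For brevity the single-UAV step below is a simple greedy fill against remaining flight time; the paper's actual single-UAV solver is graph-based and optimal, which this sketch is not. All task data is illustrative.

```python
def greedy_assign(tasks, uavs):
    """tasks: {name: duration_s}; uavs: {name: battery_s}.
    Assigns tasks UAV by UAV, maximizing covered filming time greedily."""
    remaining = dict(tasks)
    plan = {u: [] for u in uavs}
    for uav, battery in uavs.items():
        # Longest tasks first, to maximize filming time for this UAV.
        for name, dur in sorted(remaining.items(), key=lambda kv: -kv[1]):
            if dur <= battery:
                plan[uav].append(name)
                battery -= dur
                del remaining[name]
    return plan, remaining  # unassigned tasks are left for reporting

tasks = {"lateral": 120, "orbital": 90, "chase": 200, "static": 60}
uavs = {"uav_1": 300, "uav_2": 150}
plan, unassigned = greedy_assign(tasks, uavs)
```

The sequential structure matters: once the first UAV's plan is fixed, later UAVs only see the leftover tasks, which is why the overall result is near-optimal rather than optimal.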
... Figure 3 shows a couple of snapshots of the most representative windows. Further details about the Dashboard and mission design can be seen in [5]. In summary, this component allows the director to design cinematography missions, including all shot descriptions and their triggering Events, when needed. ...
Article
Full-text available
This paper presents a system for the execution of autonomous cinematography missions with a team of drones. The system allows media directors to design missions involving different types of shots with one or multiple cameras, running sequentially or concurrently. We introduce the complete architecture, which includes components for mission design, planning and execution. Then, we focus on the components related to autonomous mission execution. First, we propose a novel parametric description for shots, considering different types of camera motion and tracked targets; and we use it to implement a set of canonical shots. Second, for multi-drone shot execution, we propose distributed schedulers that activate different shot controllers on board the drones. Moreover, an event-based mechanism is used to synchronize shot execution among the drones and to account for inaccuracies during shot planning. Finally, we showcase the system with field experiments filming sport activities, including a real regatta event. We report on system integration and lessons learnt during our experimental campaigns.