Drone Director

[Image: Intelligent drones should be able to film sporting events better. Photo: Shutterstock]

Autonomous drones are revolutionising the film industry with dynamic camera perspectives.

A research team at Carnegie Mellon University in Pennsylvania is developing camera drones that learn visual preferences from humans and then derive aesthetic decisions for autonomous filming from them. The fully automated system requires neither GPS tags to locate subjects nor prior knowledge of the environment.

The underlying technology is deep reinforcement learning, and it builds on findings from user studies with human participants. They viewed scenes in a photo-realistic simulator from alternating frontal, back, left and right perspectives and then rated what they saw in the categories “visually appealing” and “artistically interesting”. From these ratings the researchers identified generic preferences and used them to devise characteristic filming techniques that promise mass appeal.
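How averaged viewer ratings might be turned into a scoring signal for choosing camera viewpoints can be sketched roughly as follows. The perspective scores, the eight-metre “ideal” shot distance and the Gaussian fall-off are illustrative assumptions, not values from the CMU study.

```python
import math

# Hypothetical averaged ratings per camera perspective on a 0-1 scale,
# standing in for the "visually appealing" scores collected in the user study.
PERSPECTIVE_SCORES = {"front": 0.82, "back": 0.41, "left": 0.63, "right": 0.65}

def aesthetic_reward(perspective: str, distance_m: float) -> float:
    """Combine the perspective preference with a soft prior on shot distance.

    The 8 m "ideal" distance and the Gaussian fall-off are illustrative
    assumptions, not parameters of the CMU system.
    """
    distance_term = math.exp(-((distance_m - 8.0) ** 2) / (2 * 3.0 ** 2))
    return PERSPECTIVE_SCORES[perspective] * distance_term

def best_viewpoint(candidates):
    """Pick the (perspective, distance) pair with the highest reward."""
    return max(candidates, key=lambda c: aesthetic_reward(*c))

if __name__ == "__main__":
    options = [("front", 6.0), ("back", 8.0), ("left", 9.5), ("right", 12.0)]
    print(best_viewpoint(options))  # -> ('front', 6.0)
```

In the actual research such a reward would feed a reinforcement-learning policy rather than a one-step maximisation, but the basic idea of ranking candidate viewpoints by a learned aesthetic score is the same.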

“We’re putting the power of a director inside the drone,” explains Rogerio Bonatti, a Ph.D. student in CMU’s Robotics Institute. “The drone positions itself to record the most important aspects in a scene. It autonomously understands the context of the scene — where obstacles are, where actors are — and it actively reasons about which viewpoints are going to make a more visually interesting scene. It also reasons about remaining safe and not crashing.”
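The trade-off Bonatti describes, weighing shot quality against the risk of a crash, can be expressed in a minimal form like the sketch below; the two-metre clearance threshold and the penalty weight are made-up illustration values, not parameters of the actual system.

```python
def viewpoint_value(aesthetic_score: float,
                    clearance_m: float,
                    min_safe_clearance_m: float = 2.0,
                    safety_weight: float = 5.0) -> float:
    """Trade off shot quality against collision risk for one candidate viewpoint.

    Viewpoints closer to an obstacle than the safe clearance are penalised
    heavily; the 2 m threshold and the weight are illustrative assumptions.
    """
    violation = max(0.0, min_safe_clearance_m - clearance_m)
    return aesthetic_score - safety_weight * violation

candidates = [
    {"name": "close-up over the track", "aesthetic": 0.9, "clearance": 1.2},
    {"name": "wide shot from the side", "aesthetic": 0.7, "clearance": 6.0},
]
best = max(candidates, key=lambda c: viewpoint_value(c["aesthetic"], c["clearance"]))
print(best["name"])  # -> wide shot from the side: safer despite the lower shot score
```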

In addition to the entertainment industry and live sporting events, the developers also see governments and police departments as potential users; these currently still rely on manually flown drones for crowd monitoring and analysis of traffic patterns. “The goal of the research is not to replace humans. We will still have a market for highly trained professional experts,” Bonatti emphasises. “The goal is to democratize drone cinematography and allow people to really focus on what matters to them.”