Back in 2017, the Microsoft Research team developed and open-sourced Aerial Informatics and Robotics Simulation (AirSim). On Monday, the team shared how AirSim can be used to solve the current challenges in the development of autonomous systems.
Microsoft AirSim and its features
Microsoft AirSim is an open-source, cross-platform simulation platform for autonomous systems, including autonomous cars, wheeled robots, aerial drones, and even static IoT devices. It works as a plugin for Epic Games’ Unreal Engine, and there is also an experimental release for the Unity game engine.
Here is an example of drone simulation in AirSim:
AirSim was built to address two main problems developers face when building autonomous systems: the need for large datasets to train and test the systems, and the ability to debug in a simulator.
With AirSim, the team aims to give developers a platform offering varied training experiences, so that autonomous systems can be exposed to different scenarios before they are deployed in the real world.
“Our goal is to develop AirSim as a platform for AI research to experiment with deep learning, computer vision and reinforcement learning algorithms for autonomous vehicles. For this purpose, AirSim also exposes APIs to retrieve data and control vehicles in a platform-independent way,” the team writes.
AirSim provides physically and visually realistic simulations by supporting hardware-in-the-loop simulation with popular flight controllers such as PX4, an open-source autopilot system. It can be easily extended to accommodate new types of autonomous vehicles, hardware platforms, and software protocols. Its extensible architecture also allows developers to quickly add custom autonomous system models and new sensors to the simulator.
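The platform-independent APIs mentioned above are typically exercised from Python through the `airsim` client package. The sketch below illustrates the idea, assuming the `airsim` package is installed and a simulator is running; the waypoint helper and its parameters are illustrative, while the client calls follow AirSim's published multirotor API:

```python
# Illustrative sketch of controlling a drone and retrieving state through
# AirSim's Python API. Assumes the `airsim` package and a running AirSim
# simulation; `square_waypoints` and `fly_square` are hypothetical helpers.

def square_waypoints(side, altitude):
    """Corners of a square flight path at a fixed altitude.
    AirSim uses the NED frame, so z is negative above ground."""
    return [
        (0, 0, -altitude),
        (side, 0, -altitude),
        (side, side, -altitude),
        (0, side, -altitude),
    ]

def fly_square(side=10.0, altitude=5.0, speed=3.0):
    import airsim  # requires a running AirSim instance to connect to

    client = airsim.MultirotorClient()
    client.confirmConnection()
    client.enableApiControl(True)
    client.armDisarm(True)
    client.takeoffAsync().join()

    # Control the vehicle: visit each corner of the square.
    for x, y, z in square_waypoints(side, altitude):
        client.moveToPositionAsync(x, y, z, speed).join()

    # Retrieve data through the same platform-independent API.
    state = client.getMultirotorState()
    print(state.kinematics_estimated.position)

    client.landAsync().join()
    client.armDisarm(False)
    client.enableApiControl(False)
```

Because the same client API is exposed regardless of the underlying vehicle model or flight controller, code like this can be reused across simulated vehicles and, with care, against real hardware.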
AirSim for tackling common challenges in autonomous systems development
In April, the Microsoft Research team collaborated with Carnegie Mellon University and Oregon State University (together, Team Explorer) on the DARPA Subterranean (SubT) Challenge. The challenge was to build robots that can autonomously map, navigate, and search underground environments during time-sensitive combat operations or disaster response scenarios.
On Monday, Microsoft’s Senior Research Manager, Ashish Kapoor, shared how they used AirSim to tackle this challenge. Team Explorer and Microsoft used AirSim to create an “intricate maze” of man-made tunnels in a virtual world. To create this maze, the team used reference material from real-world mines to modularly generate a network of interconnected tunnels. The result was a high-definition simulation of man-made tunnels that also included robotic vehicles and a suite of sensors.
AirSim also provided a rich platform that Team Explorer could use to test their methods and generate training experiences for building the various decision-making components of autonomous agents. Microsoft believes AirSim can also help accelerate the creation of a real dataset for underground environments. “Microsoft’s ability to create near-realistic autonomy pipelines in AirSim means that we can rapidly generate labeled training data for a subterranean environment,” Kapoor wrote.
Kapoor also discussed another collaboration, with Air Shepherd and USC, to help counter wildlife poaching using AirSim. In this collaboration, they developed unmanned aerial vehicles (UAVs) equipped with thermal infrared cameras that fly through national parks to search for poachers and animals. AirSim was used to create a simulation of this use case, in which virtual UAVs flew over virtual environments at altitudes of 200 to 400 feet above ground level.
“The simulation took on the difficult task of detecting poachers and wildlife, both during the day and at night, and ultimately ended up increasing the precision in detection through imaging by 35.2%,” the post reads.
These are some of AirSim’s recent use cases. To explore more or to contribute, check out its GitHub repository.