Microsoft is sharing some interesting tools with the open source community today. Developers and researchers will be able to take advantage of a new simulator that lets them test and train robots and drones in a virtual environment, preparing the machines for moving around the real world. A beta version of Microsoft’s research tool is being made available free of charge on GitHub under an open source license. It’s just the latest in a line of tools and software that Microsoft has released to the open source community in recent years.

While some simulators already exist to help test drone paths and prepare devices for autonomous operation, Microsoft claims its latest tool is far more advanced and more accurately reflects the navigation challenges of the real world. Engineers are already exploring the possibility of training systems for real-world action inside virtual worlds, retrofitting games like GTA for the task; you can even test AI creations in Minecraft. Microsoft is using the latest photorealistic rendering technologies, so its simulator lets you guide a drone over a realistic setting complete with shadows and reflections.

“You can do a lot of experiments, and even if those experiments fail they have very little cost in real life,” explains Ashish Kapoor, the Microsoft researcher in charge of the project, in an interview with The Verge. “In the real world it's extremely hard to explore all possible things, however in simulation we have the luxury of trying out many different things.”

Developers will be able to generate random environments and crash drones accordingly, but Microsoft isn’t going to limit this to just autonomous vehicles. The initial release of the tool, which Kapoor admits is in its early days, will be geared towards “any kind of autonomous vehicles,” but Kapoor believes it will eventually be able to help with computer vision and other data-driven machine learning systems as well.
“You can think of this as being a data generator,” explains Kapoor. “If you have any kind of sensor, like a barometer or even maybe say a laser or a radar, you can generate a lot of training data for any of these sensing modalities. You can generate data that you can in turn use to train.”
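To make the data-generator idea concrete, here is a minimal sketch of what such a collection loop could look like against the platform’s Python client (published as the `airsim` package, which postdates this beta, so the exact calls may differ). The camera name “0,” the waypoint ranges, and the output filenames are illustrative assumptions, not part of Microsoft’s release.

```python
# Sketch: fly to random waypoints and record barometer readings plus camera frames
# as training samples. Assumes the simulator is already running with a multirotor.
import random
import airsim

client = airsim.MultirotorClient()
client.confirmConnection()
client.enableApiControl(True)
client.armDisarm(True)
client.takeoffAsync().join()

for i in range(10):
    # Pick a random nearby waypoint and fly to it at 5 m/s (NED frame: negative z is up).
    x, y = random.uniform(-20, 20), random.uniform(-20, 20)
    client.moveToPositionAsync(x, y, -10, 5).join()

    # Grab a barometer reading and a scene image from camera "0" as one sample.
    baro = client.getBarometerData()
    responses = client.simGetImages([airsim.ImageRequest("0", airsim.ImageType.Scene)])
    with open(f"frame_{i:03d}.png", "wb") as f:
        f.write(responses[0].image_data_uint8)
    print(i, baro.altitude, baro.pressure)

client.armDisarm(False)
client.enableApiControl(False)
```

The same loop could just as easily log lidar, radar, or other simulated sensor streams, which is the point Kapoor is making: every flight in the virtual world is a batch of labeled data that costs nothing if the drone crashes.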

This idea of gathering training data is essential for researchers building the algorithms that autonomous vehicles need in order to respond correctly. The simulator isn’t designed to replace real-world testing, but to be used alongside it, replicating scenarios hundreds or thousands of times.
Microsoft’s Aerial Informatics and Robotics Platform includes support for DJI and MavLink drones, so developers don’t have to write separate code to control them. Microsoft is planning to add more tools to the platform in the future to help developers build perception abilities and improve the safety of AI-powered autonomous vehicles. You can find Microsoft’s simulator and tools over at the company’s GitHub repository.
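To illustrate the “no separate code” point, the sketch below flies a short mission through the platform’s client API. Whether those commands drive the built-in simulated flight controller or a MavLink-based autopilot such as PX4 is decided by the simulator’s vehicle configuration, not by this script; the specific calls again follow the later Python client, so treat them as an assumption rather than the beta’s exact API.

```python
# Sketch: a vehicle-agnostic mission. The configured vehicle backend (simulated
# controller or MavLink autopilot) is chosen in the simulator's settings, so this
# code does not change when the target drone does.
import airsim

client = airsim.MultirotorClient()
client.confirmConnection()
client.enableApiControl(True)
client.armDisarm(True)

client.takeoffAsync().join()
client.moveToPositionAsync(0, 0, -20, 3).join()  # climb to 20 m (NED: -z is up)
client.hoverAsync().join()
client.landAsync().join()

client.armDisarm(False)
client.enableApiControl(False)
```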
