InterDrone 2016, only in its second year, brought together everyone in the drone industry – mixing leaders and new entrants, manufacturers and end users, and top notch speakers with world class questions in an open and collaborative environment rarely found on the UAS conference circuit. NASA was represented by PK discussing the latest in Unmanned Traffic Management (UTM); keynote speeches from 3D Robotics, DJI, Google, Yuneec, Intel, AirMap, and the rest of the usual world class suppliers in the UAS field covered the latest technological trends; and experts in risk management and operations (including Wolf UAS) discussed how to turn your operation into a professional and safe one, rather than just another hobby with a Square Card reader. Having had the opportunity to speak with these amazing attendees, and to provide my insight into Precision Agriculture alongside brilliant fellow panelists, I am looking forward to going again next year.
The real story was how diverse, intelligent, and prolific the group of attendees was – people who joined these top notch speakers all looking to learn how to make their way in the videography, agriculture, or construction/inspection markets using unmanned aircraft platforms. This conference distinguished itself as easily on par with the leaders in the field – Xponential and Commercial Drone Expo – in terms of speakers (largely the same speakers that attend AUVSI each and every year), and as it grows it is sure to expand its “show floor.” While this post could be another wide-angle look at the various suppliers and trends at the show, I’ve decided to look at one piece of technology represented at InterDrone 2016 that really has me excited: First Person Augmented Reality for Drones.
It finally happened: the technology I have been pushing for in the drone market is starting to crop up from different vendors and manufacturers, sourced from very different markets, all looking to address the deficiencies of first person view. My dream for UAS, ever since I put on my first FPV goggles flying a push-prop fixed wing roughly eight years ago in flight training, has been to apply human factors theory to an optical interface on an FAA compliant platform. While I may not have the technical expertise in programming, design, or, apparently, printing, InterDrone 2016 highlighted not just one but four distinct players in this field (three represented in this article). Before I jump right in to what they showed, let’s discuss why the interface and the user experience matter most in unmanned robotics.
Simply put, the unmanned aircraft user interface is the lifeline that lets any pilot, engineer, or other crew member understand the health, position, and direction of the aircraft. Unmanned robotics of all flavors require the individual to build a mental model of where the vehicle is, what it is doing, and where it will be at a given point in time. Until now, visual line of sight or advanced positioning systems have been the only solution in the sUAS marketplace, while in the larger UAS world, sensor balls combined with transponders, ADS-B, and other spectrum-heavy data links provide that knowledge. These glasses, overlaid with the accurate information you would normally find in the DJI GO app or other software suites requisite for flight control, are the first step in providing information to a pilot who can, without taking their eyes off their aircraft, make aeronautical decisions with a much greater awareness of their surroundings. While this does not replace the need for reliable C2 links or positioning information, it does make accessing that information far easier than ever before. Ultimately, the technology is not quite there, but one day we’ll see my true dream realized – a computer generated interface over FPV goggles for a Mario Kart-like interaction between racing drones, or training simulators that overlay waypoints, a “points system” for acceptable training, or even optimal routing and traffic notation with airspace and previously generated maps for a myriad of mission and payload applications.
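To make that mental model concrete, here is a rough sketch of the kind of computation such an overlay performs behind the scenes – a hypothetical example, not any vendor’s actual API – deriving a pilot-relative distance and bearing readout from GPS telemetry:

```python
import math
from dataclasses import dataclass

@dataclass
class Telemetry:
    """Hypothetical aircraft telemetry snapshot (field names are illustrative)."""
    lat: float        # degrees
    lon: float        # degrees
    alt_m: float      # meters above launch point
    battery_pct: int  # remaining battery, percent

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (m) and initial bearing (deg) from point 1 to point 2."""
    R = 6371000.0  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlam = math.radians(lon2 - lon1)
    # Spherical law of cosines for distance (clamped for floating-point safety)
    cos_c = math.sin(p1) * math.sin(p2) + math.cos(p1) * math.cos(p2) * math.cos(dlam)
    dist = math.acos(max(-1.0, min(1.0, cos_c))) * R
    # Initial bearing, normalized to 0-360 degrees
    y = math.sin(dlam) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlam)
    brg = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return dist, brg

def hud_line(pilot_lat, pilot_lon, t: Telemetry) -> str:
    """Format one line of a heads-up telemetry readout relative to the pilot."""
    dist, brg = distance_and_bearing(pilot_lat, pilot_lon, t.lat, t.lon)
    return f"ALT {t.alt_m:.0f}m  DST {dist:.0f}m  BRG {brg:03.0f}  BAT {t.battery_pct}%"
```

The point is not the math itself but that a transparent display can keep this answer – where is my aircraft relative to me, and how healthy is it – in the pilot’s field of view at all times.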
The show included at least four different variants on this theme of FPV information access, but I only tried three of them, as the others were essentially upgraded Fat Shark goggles with better viewing. They weren’t transparent and didn’t really offer anything new to the marketplace. Those goggles have been known for a while in FPV racing (essentially enabling that activity), but they don’t offer modification or integration of information beyond the camera and basic speed/direction.
The offerings from Brother and Epson – both printing companies leveraging technology originally developed for their printing and scanning platforms – take the same approach: transparent lenses with optical projection either onto the lens of the glasses (Epson) or directly onto the eye (Brother). I am told each has its benefits in terms of glare, sun exposure, and so on. It leaves me wondering, however, about the overall effect on eye fatigue and the strain on the muscles involved. The Epson offering could also control the sensor ball by head movement, which may be amazing for a pilot flying without a sensor operator but, without proper training and continued currency, may become disorienting.
Avegant, previously an entertainment company, is also making its way into this space by adapting a device originally built for media consumption. While I appreciated the overall design and application of the technology, covering your ears with speakers that do not relay sound from the aircraft to report aircraft health, or anything else useful, only limits your situational awareness both in the air and on the ground. Avegant’s platform, though the coolest looking and able to control the payload on the drone itself for directional movement, really didn’t overlay as much useful information as the other offerings at this point. The smooth ability to control the Inspire 1 payload was probably its greatest strength and really did provide seamless control of the camera.
The promise of this technology is information access, not cool factor, and it’s important to remember that. If the future of UAS is truly commercial, then these goggles will need to include the data and information access that helps, not hinders, their pilots. The more integrations with major suppliers that are announced – as seen here – the more likely this informational approach will promote better flying, easier information access, and much greater human-machine interfaces. I, personally, look forward to the promise of aerial laser tag, computer-monitored Mario Kart in the sky, and the gamification of flight training.