The world’s first drone deliveries have begun trial runs in the United Kingdom and the U.S. Once primarily used by militaries, small quadcopter and octocopter drones are now so commonplace they are for sale at home improvement stores and toy stores. People are flying drones for fun, for entertainment and for commercial purposes as diverse as filmmaking and farming.
All these uses have one thing in common: The drone’s human operator is required by law to be able to see the drone at all times. Why? The answer is simple: to make sure the drone doesn’t hit anything.
Beyond just wanting not to crash and damage their drones or themselves, drone operators must avoid collisions with people, property and other vehicles. Specifically, federal aviation regulations forbid aircraft – including drones – from flying “so close to another aircraft as to create a collision hazard.” The rules also require that “vigilance shall be maintained by each person operating an aircraft so as to see and avoid other aircraft.” These requirements are commonly referred to simply as “see-and-avoid”: Pilots must see and avoid other traffic.
But that places a significant limitation on drone operations. The whole point of drones is that they are unmanned. Without a human operator on board, though, how can a drone steer clear of collisions? This is a crucial problem for Amazon, Google and any other company that wants to deliver packages with drones.
To be practical, delivery drones would have to be able to fly long distances, well out of sight of a human operator. How, then, can the operator prevent the drone from hitting a tree, building, airplane or even another drone? Although cameras could be mounted on the drone for this purpose, current civil drone video transmission technology is limited to a range of a few miles. As a result, in order to perform long-distance deliveries, the drone must autonomously detect nearby objects and avoid hitting them.
As a drone operations researcher, I keep a close eye on ways to achieve this. New research into sensors – at least some of it drawing on the development of autonomous cars – is making increased autonomy possible for drones, potentially opening the skies to even more innovation.
Radar and lidar
There are two main technologies available for drones to detect nearby objects. The first is radar, developed just before World War II, which sends out radio waves and measures their reflections from obstacles. Radar is still the primary system air traffic controllers use to track planes in the sky. Ships also use radar to avoid collisions at night or in foggy conditions.
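The basic idea behind radar ranging can be sketched in a few lines of code. The snippet below is an illustration of the principle only, not any particular radar's firmware; the function name and example numbers are my own.

    # Illustrative sketch of radar ranging: time a radio pulse's echo and
    # convert that round-trip time into a distance.
    SPEED_OF_LIGHT_M_PER_S = 299_792_458.0  # radio waves travel at light speed

    def range_from_echo(round_trip_seconds):
        """Distance to the reflecting object, in meters.

        The pulse travels out and back, so the one-way distance is half
        of the speed of light times the round-trip time.
        """
        return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2.0

    # Example: an echo arriving 2 microseconds after transmission
    print(range_from_echo(2e-6))  # roughly 300 meters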
Lidar, developed more recently, uses laser beams instead of radio waves, and can provide extremely detailed images of nearby features. The catch is that both radar and lidar systems have been bulky, heavy and expensive. That makes them hard to fit on relatively small drones; also, heavier drones require more battery power to stay aloft, which requires bigger (and heavier) batteries.

A small lidar sensor. Velodyne, CC BY-ND
There is hope, though. Research into obstacle sensors and collision avoidance technology for autonomous automobiles has spurred the development of smaller, lower-cost radar and lidar devices. Once they are sufficiently small, and energy-efficient enough not to quickly drain drone batteries, both types of sensors could help solve the drone "see-and-avoid" problem – or really, since drones don't have eyes, the "detect-and-avoid" problem.
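To make the detect-and-avoid idea concrete, here is a minimal sketch, assuming a sensor (radar or lidar) that reports obstacle positions as (x, y, z) offsets from the drone in meters. The safety radius, function name and example points are my assumptions, not part of any real autopilot; a real system would also track how obstacles move and plan a new path.

    import math

    SAFETY_RADIUS_M = 10.0  # assumed keep-out distance; real systems would tune this

    def too_close(obstacle_points_m, safety_radius_m=SAFETY_RADIUS_M):
        """Return True if any detected point lies inside the safety bubble.

        obstacle_points_m: iterable of (x, y, z) offsets from the drone, in
        meters, as a radar or lidar unit might report them.
        """
        drone = (0.0, 0.0, 0.0)
        return any(math.dist(drone, point) < safety_radius_m
                   for point in obstacle_points_m)

    # Example: one return 6 meters ahead should trigger an avoidance maneuver
    if too_close([(6.0, 0.0, 0.0), (40.0, 12.0, 3.0)]):
        print("Obstacle detected: hold position and replan the route")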
An in-flight view
A recent test flight here at Ohio University involved a lidar sensor mounted on a drone. When the drone was approximately five feet above the ground, the lidar was able to create an image of its surroundings.
A lidar image from a drone in flight. Michael Braasch, CC BY-ND
On one side, the image had bushy-looking areas representing trees and foliage. On the other, there were parallel lines indicating the location of a building wall. And in the middle were some circular shapes representing the ground. This sort of obstacle detection and discernment will be essential for routine drone operation, particularly during takeoff and landing.
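One simple way to get a first cut at that discernment is to separate lidar returns near the ground from returns that stick up into the flight path. The sketch below is my own toy example of such a split using a height threshold; it is not the processing actually used in the Ohio University test, and the threshold value is an assumption.

    GROUND_HEIGHT_THRESHOLD_M = 0.5  # assumed cutoff for "this is probably ground"

    def split_ground_and_obstacles(points_m):
        """Split lidar returns into likely ground and likely obstacles.

        points_m: (x, y, z) returns in a frame where z is height above the
        ground, in meters.
        """
        ground, obstacles = [], []
        for x, y, z in points_m:
            if z < GROUND_HEIGHT_THRESHOLD_M:
                ground.append((x, y, z))
            else:
                obstacles.append((x, y, z))
        return ground, obstacles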
We are currently in what might be called the “Wright Brothers era” of drone development. Removing the human from the cockpit has challenged innovators and designers in a number of ways – including solving the task of obstacle detection. But as our technology advances, eventually – just like elevators that used to be operated by humans – people will grow used to the idea of these machines operating autonomously.
Michael Braasch has received funding from the FAA and NASA for drone research.
This article was originally published on The Conversation. Read the original article.

