The world’s first drone deliveries have begun trial runs in the United Kingdom and the U.S. Once primarily used by militaries, small quadcopter and octocopter drones are now so commonplace they are for sale at home improvement stores and toy stores. People are flying drones for fun, for entertainment and for commercial purposes as diverse as filmmaking and farming.
All these uses have one thing in common: The drone’s human operator is required by law to be able to see the drone at all times. Why? The answer is simple: to make sure the drone doesn’t hit anything.
Beyond just wanting not to crash and damage their drones or themselves, drone operators must avoid collisions with people, property and other vehicles. Specifically, federal aviation regulations forbid aircraft – including drones – from flying “so close to another aircraft as to create a collision hazard.” The rules also require that “vigilance shall be maintained by each person operating an aircraft so as to see and avoid other aircraft.” These requirements are commonly referred to simply as “see-and-avoid”: Pilots must see and avoid other traffic.
But that places a significant limitation on drone operations. The whole point of drones is that they are unmanned. Without a human operator on board, though, how can a drone steer clear of collisions? This is a crucial problem for Amazon, Google and any other company that wants to deliver packages with drones.
To be practical, delivery drones would have to be able to fly long distances, well out of sight of a human operator. How, then, can the operator prevent the drone from hitting a tree, building, airplane or even another drone? Although cameras could be mounted on the drone for this purpose, current civil drone video transmission technology is limited to a range of a few miles. As a result, in order to perform long-distance deliveries, the drone must autonomously detect nearby objects and avoid hitting them.
As a drone operations researcher, I keep a close eye on ways to achieve this. New research into sensors – at least some of which come from development of autonomous cars – is making increased autonomy possible for drones, potentially opening the skies to even more innovation.
Radar and lidar
There are two main technologies available for drones to detect nearby objects. The first is radar, developed just before World War II, which sends out radio waves and measures their reflections from obstacles. Radar is still the primary system air traffic controllers use to track planes in the sky. Ships also use radar to avoid collisions at night or in foggy conditions.
Lidar, developed more recently, uses laser beams instead of radio waves, and can provide extremely detailed images of nearby features. The catch is that both radar and lidar systems have been bulky, heavy and expensive. That makes them hard to fit on relatively small drones; also, heavier drones require more battery power to stay aloft, which requires bigger (and heavier) batteries.
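Both technologies rely on the same time-of-flight principle: radio waves and laser pulses travel at the speed of light, so the delay of a returning echo directly gives the distance to whatever reflected it. A minimal sketch of that calculation in Python (the 200-nanosecond delay is purely illustrative, not drawn from any particular sensor):

    # Time-of-flight ranging: radar and lidar both measure the round-trip
    # delay of a reflected signal that travels at the speed of light.
    SPEED_OF_LIGHT = 299_792_458  # meters per second

    def echo_to_range(round_trip_seconds: float) -> float:
        """Convert a round-trip echo delay into a one-way distance in meters."""
        # The signal travels out to the obstacle and back, so divide by 2.
        return SPEED_OF_LIGHT * round_trip_seconds / 2

    # An echo returning after 200 nanoseconds (an illustrative value)
    # indicates an obstacle roughly 30 meters away.
    print(echo_to_range(200e-9))  # ~29.98 meters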

A small lidar sensor. Velodyne, CC BY-ND
There is hope, though. Research in obstacle sensors and collision avoidance technology for autonomous automobiles has spurred the development of smaller, lower-cost radar and lidar devices. Once they are small enough, and energy-efficient enough not to drain drone batteries quickly, both types of sensors could help solve the drone “see-and-avoid” problem – or really, since drones don’t have eyes, the “detect-and-avoid” problem.
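To make “detect-and-avoid” concrete, here is a deliberately simplified sketch in Python. It assumes the sensor reports obstacles as (x, y, z) points in the drone’s own frame of reference, and the 10-meter safety bubble is a made-up figure; a real system would also have to handle sensor noise, relative velocities and the drone’s flight dynamics:

    from dataclasses import dataclass
    from math import sqrt

    @dataclass
    class Point:
        x: float  # meters ahead of the drone
        y: float  # meters to the drone's right
        z: float  # meters above the drone

    SAFETY_RADIUS_M = 10.0  # illustrative safety bubble, not a regulatory value

    def nearest_obstacle_m(points: list[Point]) -> float:
        """Distance from the drone to the closest sensed point."""
        return min(sqrt(p.x**2 + p.y**2 + p.z**2) for p in points)

    def must_avoid(points: list[Point]) -> bool:
        """True if any sensed point is inside the safety bubble."""
        return bool(points) and nearest_obstacle_m(points) < SAFETY_RADIUS_M

Even this toy version captures the essential shift: the decision to maneuver has to be made on board, from sensor data alone, with no human in the loop.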
An in-flight view
A recent test flight here at Ohio University involved a lidar sensor mounted on a drone. When the drone was approximately five feet above the ground, the lidar was able to create an image of its surroundings.
A lidar image from a drone in flight. Michael Braasch, CC BY-ND
On one side, the image had bushy-looking areas representing trees and foliage. On the other were parallel lines indicating the location of a building wall. And in the middle were some circular shapes representing the ground. This sort of ability to detect and discern obstacles will be essential for routine drone operation, particularly during takeoff and landing.
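Separating the ground from walls and foliage in such an image can begin with something as simple as a height check. A toy illustration in Python, assuming the drone knows its altitude above roughly flat ground (the 0.3-meter tolerance is an arbitrary choice); production systems fit planes to the returns and cluster what remains, but the basic idea is similar:

    def split_ground(points, altitude_m, tolerance_m=0.3):
        """Split lidar returns into ground and obstacle points by height.

        points: iterable of (x, y, z) tuples in the drone's frame,
                with z positive upward, so flat ground sits near z = -altitude_m.
        """
        ground, obstacles = [], []
        for x, y, z in points:
            if abs(z + altitude_m) <= tolerance_m:
                ground.append((x, y, z))     # the circular rings under the drone
            else:
                obstacles.append((x, y, z))  # walls, trees, other aircraft
        return ground, obstacles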
We are currently in what might be called the “Wright Brothers era” of drone development. Removing the human from the cockpit has challenged innovators and designers in a number of ways – including solving the task of obstacle detection. But as our technology advances, eventually – just like elevators that used to be operated by humans – people will grow used to the idea of these machines operating autonomously.
Michael Braasch has received funding from the FAA and NASA for drone research.
This article was originally published on The Conversation. Read the original article.

