IoT and Computer Vision Technologies as Key Enablers of Drone-based Industrial Automation
Posted On June 8, 2018
Gone are the times when drones were tightly associated with the military sphere or regarded as an entertaining hobby. Different industrial areas are experimenting with possible ways to use drones, and some industries are proving especially efficient at leveraging their potential, thanks to the development of the technologies behind drone software: the Internet of Things (IoT) and Computer Vision (CV).
Drone Software: Market is Changing
Sales of drones, classified as Unmanned Aerial Vehicles (UAVs), are expected to triple in the coming two to three years, according to a Federal Aviation Administration (FAA) report. This means not only a growing number of drones, but also an expanding scope of application, as they become more and more common in commercial use. The current usage of drones is centered on niche use cases – first of all military and entertainment. But the situation is changing. First of all, more investment is coming into the drone development area, driving its progress.
Big-name tech companies have started various collaboration projects, which unite different expertise and work toward market development. For example, in May 2018 Microsoft announced a new partnership with DJI to bring the tech giant’s IoT technologies to drone manufacturing. Second, the constantly falling cost of components used in drone production lowers the technology cost and, as a consequence, the price of the drone itself.
What Makes up Computer Vision in UAV-based Solutions
For successful commercial application, drones need to possess the following key features:
- Reliable data capturing, which happens during the flight of the drone;
- Reliable hardware components, which accomplish the data capturing function.
The components used for data capturing include cameras and sensors pre-installed on the drone. And while different models of cameras fulfil the same function, sensors vary in application and in the data they collect.
The Most Frequently Used Sensors Are:
- Multispectral and Hyperspectral sensors, which identify visible (red, green and blue), infrared and ultraviolet regions in the electromagnetic spectrum;
- Light Detection and Ranging (or Lidar) sensor, which measures the distance to an object by illuminating a target with a laser and analyzing the reflected light;
- Photogrammetry (strictly a technique rather than a sensor), based on the principle of transforming objects on captured images (i.e. photographs) from 2D into a 3D model;
- Heat detector sensors, which act like a thermal imaging camera and allow drone users to map the heat signatures of buildings and landscape;
- Depth sensors used to produce Digital Elevation Maps (DEMs), which measure the delay of a light signal between the camera and the subject for each point of the image.
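Lidar and the depth sensors above rest on the same round-trip-time principle: distance is the speed of light times the pulse return time, halved. A minimal sketch (the 667 ns pulse time is an illustrative figure, not from the article):

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def distance_from_return_time(return_time_s: float) -> float:
    """Distance to a target from the round-trip time of a light pulse.

    The pulse travels to the object and back, so the one-way distance
    is half the total path length.
    """
    return SPEED_OF_LIGHT * return_time_s / 2.0

# A pulse returning after ~667 nanoseconds corresponds to roughly 100 m.
d = distance_from_return_time(667e-9)
```

The same arithmetic, run per pixel, is what turns a ToF sensor's raw timing data into a depth image.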
IoT Computer Vision and Drones: How These Technologies Blend into Intelligent Image Processing Solutions
A drone equipped with cameras and sensors of different types is capable of collecting extensive volumes of data that are otherwise not easily accessible to humans. But a drone can be much more than a mere data collector: it can interconnect with other data-collecting devices and pass the gathered data to an operator (i.e. a Ground Station, or GS) connected within a single IoT ecosystem.
Taking into account the large number of sensors collecting extensive data, a drone needs enough bandwidth to interconnect with other devices and, while in motion, transmit the data to the GS in real time or near-real time. Nowadays the most popular industrial drones, such as the Mavic 2 Pro and the like, use WiFi for these aims, operating at frequencies starting from 2.4 GHz. However, WiFi is not omnipresent, and relying only on it can lead to frequent network congestion.
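Whether a given link suffices can be checked with a back-of-envelope sum of per-sensor data rates. A sketch with hypothetical rates and an assumed usable link capacity (none of these figures come from the article):

```python
def required_bandwidth_mbps(streams: dict) -> float:
    """Sum per-sensor data rates (in Mbit/s) to estimate the downlink a drone needs."""
    return sum(streams.values())

# Hypothetical sensor payload: compressed HD video plus lidar and telemetry.
streams = {"hd_video": 8.0, "lidar": 2.5, "multispectral": 1.5, "telemetry": 0.1}
needed = required_bandwidth_mbps(streams)

link_capacity = 10.0  # assumed usable Mbit/s on a congested 2.4 GHz WiFi link
congested = needed > link_capacity  # True here: the link cannot carry every stream
```

When the sum exceeds capacity, the usual mitigations are onboard compression, downsampling, or buffering data for post-flight transfer.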
Computer Vision in Drones: Making the Best out of the Drone Application
Drone manufacturers are working on the application of advanced computer vision technologies paired with Deep Learning and Artificial Intelligence in order to put data captured from sensors to work.
What Operations Drone CV Performs
- Object tracking. Nowadays, object tracking remains quite a manual process: a drone receives instructions from the GS regarding the target size, its initial position and other parameters in order to point the onboard camera and acquire the target images. The aim, though, is to make it autonomous thanks to deep learning achievements in Convolutional Neural Network (CNN) building. It means that a drone captures raw real-time data during the flight, processes it with an onboard intelligence system in real time, and makes a human-independent decision based on the processed data.
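A CNN-based tracker typically associates fresh detections with the current track by bounding-box overlap. A minimal, detector-agnostic sketch of that association step (the box coordinates and the 0.3 threshold are illustrative assumptions):

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def update_track(track_box, detections, min_iou=0.3):
    """Pick the CNN detection that best overlaps the current track box."""
    best = max(detections, key=lambda d: iou(track_box, d), default=None)
    if best is not None and iou(track_box, best) >= min_iou:
        return best
    return track_box  # no good match: keep the last known position

track = (100, 100, 150, 150)
detections = [(110, 105, 160, 155), (300, 300, 340, 340)]
track = update_track(track, detections)  # follows the nearby detection
```

In a real pipeline the detections would come from a CNN detector running on the onboard computer, frame by frame.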
- Self-navigation. There is an evident turn from remote control operated by a person towards autonomous navigation of drones enabled by computer vision. Given pre-defined GPS coordinates for the departure and destination points, drones are capable of finding the optimal route and getting there without manual control, thanks to their computer vision functions paired with advances in Artificial Intelligence. This feature not only excludes cases of loss of control, but also positively influences battery life.
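Route planning over pre-defined GPS coordinates starts with computing distances between waypoints, for which the standard haversine formula is commonly used. A sketch (the waypoints in the usage test are illustrative):

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    p1, p2 = radians(lat1), radians(lat2)
    dlat, dlon = p2 - p1, radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(p1) * cos(p2) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def route_length_m(waypoints):
    """Total length of a waypoint route given as a list of (lat, lon) pairs."""
    return sum(haversine_m(*a, *b) for a, b in zip(waypoints, waypoints[1:]))
```

An autonomous flight controller would combine such geometric distances with battery and wind constraints to pick the actual route.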
- Obstacle detection and collision avoidance technologies. Unfortunately, GPS-only navigation can’t solve the problem of collision avoidance. As a result, it is not rare for drones to smash into trees, pipelines, buildings – or other drones, whose numbers in the air will only continue to grow. Hence the necessity to provide for a drone’s ability to detect obstacles, both static and in motion, and avoid them when moving at high speed.
How Collision Avoidance Systems Work
Collision avoidance systems include 2 key elements:
- Obstacle detection sensors to scan the surroundings (which can be vision, ultrasonic, infrared, ToF, etc.);
- SLAM (standing for ‘Simultaneous Localization and Mapping’) technology, which turns the received images into 3D maps built with the data from the sensors. This software performs the image analysis, extracts and analyzes the data, and provides real-time feedback if it detects an obstacle.
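One common trigger for an avoidance manoeuvre is time-to-collision, computed from a range-sensor reading and the closing speed. A simplified sketch (the 1.5 s reaction budget is an assumption, not a figure from the article):

```python
def time_to_collision_s(range_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if the range keeps shrinking at the current closing speed."""
    if closing_speed_mps <= 0:
        return float("inf")  # obstacle is not getting closer
    return range_m / closing_speed_mps

def should_evade(range_m, closing_speed_mps, reaction_time_s=1.5):
    """Trigger an avoidance manoeuvre when impact is nearer than the reaction budget."""
    return time_to_collision_s(range_m, closing_speed_mps) < reaction_time_s

# A wall 12 m ahead while closing at 10 m/s leaves only 1.2 s: evade.
alarm = should_evade(12.0, 10.0)
```

The reaction budget is exactly the quantity the article's next paragraph worries about: it must cover sensing, analysis, and the rerouting manoeuvre itself.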
So far the biggest challenge for developers is to build software quick enough and precise enough to analyze the data in motion, taking into account the speed of the drone and the time it needs to reroute from its track.
Challenges to Massive Adoption of Computer Vision IoT Technologies in Drone Solutions
- Draining battery power. Power consumption is a crucial issue and one of the principal bottlenecks for the drone manufacturing industry. Nowadays the majority of drone manufacturers use Lithium Polymer (LiPo) batteries. The most technically advanced drones can stay in the air for around 30 minutes, and developers are struggling to find an optimal power-to-weight ratio that will allow a drone to stay in the air for as long as possible without a massive increase in its weight.
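The endurance trade-off can be estimated from battery energy and average power draw. A rough sketch with hypothetical LiPo figures (the 80% usable-capacity fraction and the 180 W hover draw are assumptions):

```python
def flight_time_min(capacity_mah, voltage_v, avg_power_w, usable_fraction=0.8):
    """Rough hover endurance: usable battery energy divided by average power draw."""
    energy_wh = capacity_mah / 1000 * voltage_v * usable_fraction
    return energy_wh / avg_power_w * 60  # hours -> minutes

# Hypothetical quadcopter: 5000 mAh 4S LiPo (14.8 V) drawing ~180 W in hover,
# which lands close to the ~20-30 minute endurance typical of today's drones.
minutes = flight_time_min(5000, 14.8, 180)
```

The catch the article points at is visible in the formula: a bigger battery raises `capacity_mah` but also raises the weight, and with it `avg_power_w`.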
- Immature sensor technologies. Sensors capture and store a set of images, and later turn the data into a digital form so that it can be acted upon. There is constant work on making sensors smaller and more capable of collecting diverse types of parameters. At the moment hardware manufacturers are still striving for a perfect combination of sensor efficiency and weight.
However, the sensor segment is continuously developing, with new technologies entering and advancing drone manufacturing. Thus, in 2016 drones with Time of Flight (ToF) depth sensors came to the market. By emitting very short infrared light pulses and measuring the return time, they can perform object scanning, distance measurement and much more. In November 2017 drones mounted with methane sensors were launched, allowing drones to fly over landfill dumps and monitor the levels of gas emitted into the atmosphere. They can also be used in the oil and gas, and energy industries.
Drone Industrial Applications: Top 5 Application Areas in 2018
In some spheres drones empowered with IoT and CV technologies are already applied, and the results are sometimes striking.
In agriculture, the use of drones can give rise to precision farming, i.e. giving farmers extensive information about their assets. Agriculture is quite a conservative area, but it is adapting step by step to the ‘smart farming’ approach, using connected drones and 3D mapping software. Drones act as part of the IoT ecosystem: their sensors collect data about the state of crops, water, soil and other parameters, while their cameras leverage deep learning and computer vision on video streams to identify pests, diseases and nutritional deficiencies in crops. All the data is stitched together into a single panoramic picture, then analyzed to give recommendations on watering and fertilizing schedule adjustments, pesticide use and current anomalies, in order to increase yield while decreasing production costs.
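Crop health checks on multispectral data often reduce to the NDVI (Normalized Difference Vegetation Index), computed from near-infrared and red reflectance. A sketch over hypothetical per-cell readings (the 0.4 stress threshold is an assumption):

```python
def ndvi(nir: float, red: float) -> float:
    """NDVI from near-infrared and red reflectance; healthy vegetation scores high."""
    return (nir - red) / (nir + red) if (nir + red) else 0.0

def flag_stressed_cells(nir_band, red_band, threshold=0.4):
    """Indices of field cells whose NDVI falls below a health threshold."""
    return [i for i, (n, r) in enumerate(zip(nir_band, red_band))
            if ndvi(n, r) < threshold]

# Hypothetical per-cell reflectances from a multispectral pass over one field row.
nir = [0.62, 0.55, 0.30, 0.60]
red = [0.10, 0.12, 0.25, 0.08]
stressed = flag_stressed_cells(nir, red)  # cell 2 falls below the threshold
```

In practice this computation runs per pixel over the stitched orthomosaic, producing the crop-health heat maps that drive the watering and fertilizing recommendations.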
According to a Goldman Sachs report, construction companies will spend a total of $11.2 billion on drones between 2016 and 2020, thus becoming the biggest investors in the commercial drone market. Real-time aerial inspection, carried out by drones empowered with CV technologies and passed to software for further analysis, gives extensive information about what is happening on site, so problems can be detected and tackled in time.
Plus, it makes working practices safer in areas that are hard for workers to reach. A bright example of how big-name companies are deploying drones is Caterpillar: one of the largest construction equipment manufacturers started investing in the drone startup Airware in 2017 for inspecting roofs, construction sites, mining operations and utilities.
- Infrastructure inspection
Drones can replace humans in hazardous infrastructure maintenance work. They can collect data that is hardly accessible or hazardous for humans to obtain, and transfer it for detecting key infrastructure problems. The oil and gas industry, solar facilities and power lines are among the first to apply drones to inspecting their operating facilities. One of the first examples is Duke Energy, which was granted permission by the Federal Aviation Administration in 2015 to test drones, equipped with infrared sensors and cameras, at the Marshall Steam Station in Sherrills Ford, North Carolina.
A delivery drone, sometimes called a ‘parcelcopter’, can be utilized to transport packages, food and other goods. The brightest example in the area is Amazon Prime Air, tested in the UK in 2016. As the service is supposed to deliver packages to different areas, both rural and metropolitan, numerous obstacles can be met along the way. That’s why it was essential to provide object recognition and collision avoidance functions, ensured by advanced CV technologies, in order to guarantee secure delivery.
Not only parcels but also people can be carried by drones, as companies such as Airbus and Ehang Corp, a major Chinese drone maker, believe. Tests of a two-passenger fully autonomous drone, held in Dubai in September 2017, were successful enough that Dubai Crown Prince Sheikh Hamdan bin Mohammed stated that Dubai plans to make autonomous trips account for a quarter of all trips by 2030.
For the test version of the drone, they offer an interesting battery solution: it is equipped with 9 independent battery systems, each taking two hours to fully charge (though the plans are to significantly reduce the recharge time).
To leverage the full potential of the drone, developers are moving toward the concept of the drone as a device that is connected and autonomous at the same time. ‘Connected’ stands for collecting pre-defined data for further analysis, while ‘autonomous’ implies the ability to act independently of humans while in motion, in order to properly react to challenges that were not pre-defined, such as obstacle avoidance. Big data collected this way and properly analyzed could bring benefits to some of the world’s biggest industries and help solve humanity’s greatest problems.