Small drones can already pull off impressive aerial acrobatics, like the drone we saw earlier threading its way through windows. Until now, however, such feats could only be achieved in a laboratory environment.
As you can see in the video, the devices with bright red lights surrounding the drone are motion-capture cameras. They collect precise position data hundreds of times per second and transmit it to an external computer system, which then controls the drone. The drone itself merely executes the instructions the computer sends it and performs the prescribed maneuvers.
In that sense, this small drone is not autonomous at all. It is more like a puppet, carrying out the instructions of the array of cameras and the computer system around it.
That situation has now changed. Recently, the laboratory of drone pioneer Vijay Kumar unveiled a remarkable new system: a drone that flies through windows autonomously, using only on-board sensors and computing components. No external cameras are needed, and no external computer either. For the first time, the drone has a "mind" of its own and can complete the task entirely by itself.
The defining feature of the system is that it relies solely on on-board sensors and computing components to perform localization, state estimation, and trajectory planning, with no outside help of any kind.
The UAV carries an IMU, a Qualcomm Snapdragon processor, and a Hexagon DSP. It is very light, weighing only 250 g; it flies through the window at 4.5 m/s, pulls up to 1.5 g of acceleration, and can roll 90 degrees to fit through narrow gaps.
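As a quick sanity check (my own back-of-the-envelope arithmetic, not a figure from the paper), the reported speed and acceleration imply that the tightest coordinated turn the drone could fly has a radius of roughly 1.4 m:

```python
# Back-of-the-envelope check using the figures quoted above
# (illustrative only; not a calculation from the researchers).
v = 4.5           # forward speed through the window, m/s
a = 1.5 * 9.81    # maximum lateral acceleration, m/s^2 (1.5 g)

# For a coordinated turn, centripetal acceleration a = v^2 / r,
# so the minimum turn radius is:
r = v ** 2 / a
print(f"minimum turn radius ~ {r:.2f} m")
```

That radius is on the scale of a room, which gives a feel for how aggressively the drone has to maneuver indoors.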
The lead researchers on the system were Giuseppe Loianno, Chris Brunner, and Gary McGrath, all associated with Vijay Kumar's laboratory at the University of Pennsylvania. Shrinking a system that once required an entire laboratory's worth of equipment down onto a small drone is a huge achievement; its development took six years.
One caveat deserves attention, however: the drone was told the position of the obstacle in advance before performing the maneuver. In other words, the obstacle's position was not something the drone perceived in real time. Vijay Kumar and Giuseppe Loianno responded:
"The drone does have a front-facing stereo camera for dense mapping, but it is not yet integrated into the real-time trajectory planning and control framework (that will be completed soon). In this paper we solved the problems of state estimation, trajectory control, and planning: how to estimate the full state of the drone from a single camera and an IMU, and how to perform trajectory planning and control on the on-board processor."
Vijay explained that the hardest part was trajectory planning and control, because it demanded a "creative approach":
"When the drone passes through the window, you see it build up momentum and roll over. This happens automatically, as one smooth movement (unlike earlier window-flying demos, where the maneuver was broken into three segments and stitched together). Another challenge was estimating position and velocity from a single camera and an IMU while running the feedback control loop at 200 Hz."
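The single-camera-plus-IMU estimation the quote describes is, at its core, a sensor-fusion problem: integrate fast IMU measurements at the 200 Hz loop rate, then correct the accumulated drift with slower camera fixes. Here is a minimal sketch of that idea using a simple complementary filter; the actual system uses a far more sophisticated visual-inertial estimator, and all function names and constants below are my own illustrative assumptions:

```python
import numpy as np

DT = 1.0 / 200.0   # the 200 Hz loop rate mentioned in the quote
ALPHA = 0.02       # camera blend weight (assumed, for illustration)

def propagate(pos, vel, accel):
    """Dead-reckon position/velocity from one IMU acceleration sample."""
    vel = vel + accel * DT      # integrate acceleration -> velocity
    pos = pos + vel * DT        # integrate velocity -> position
    return pos, vel

def correct(pos, cam_pos):
    """Nudge the dead-reckoned position toward a slower camera fix."""
    return (1.0 - ALPHA) * pos + ALPHA * cam_pos

# Example: one second of constant 1 m/s^2 acceleration along x.
pos = np.zeros(3)
vel = np.zeros(3)
accel = np.array([1.0, 0.0, 0.0])
for _ in range(200):
    pos, vel = propagate(pos, vel, accel)
print(pos)   # close to [0.5, 0, 0], i.e. 0.5 * a * t^2
```

Pure IMU integration like `propagate` drifts quickly, which is why blending in camera observations (as in `correct`) is essential for flight lasting more than a few seconds.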
The researchers are currently working on real-time mapping, which is key to obstacle detection and dynamic path planning.
In any case, we are delighted to see this little machine on its way to working outside the lab. There are plenty of half-open windows in my house; it would be a shame not to have a drone to fly through them.
Via IEEE Spectrum
Further reading:
An edible drone: its wings are made of sausage — could you resist?
Watch your kid, and catch that drone!