Air-Cobot
Country | France |
---|---|
Type | Cobot |
Website | aircobot |
Air-Cobot (Aircraft Inspection enhanced by smaRt & Collaborative rOBOT) is a French research and development project to produce a wheeled collaborative mobile robot able to inspect aircraft during maintenance operations. This multi-partner project involves research laboratories and industry. Research around the prototype covers three domains: autonomous navigation, human-robot collaboration and nondestructive testing.
Air-Cobot is presented as the first wheeled robot able to perform visual inspections of aircraft. Inspection robots using other types of sensors have been considered before, such as the European project Robair. Since the launch of the project, other solutions based on image processing began to be developed, such as EasyJet with a drone, the swarm of drones from Toulouse company Donecle and the Aircam project of the aerospace manufacturer Airbus.
Since the beginning of the project in 2013, the Air-Cobot robot has been dedicated to inspecting the lower parts of an aircraft. A planned continuation of the project would couple it with a drone to inspect an aircraft's upper parts. In October 2016, Airbus Group launched its research project on the hangar of the future in Singapore; the robots from the Air-Cobot and Aircam projects are included in it.
Project description
Objectives
Launched in January 2013,[1] the project is part of the Interministerial Fund program of Aerospace Valley, a business cluster in southwestern France.[2] With a budget of over one million euros,[3] Air-Cobot aims to develop an innovative collaborative mobile robot, autonomous in its movements and able to perform the inspection of an aircraft with nondestructive testing sensors during preflight or during maintenance operations in a hangar.[2][4] Testing has been performed at the premises of Airbus and Air France Industries.[5]
Partners
The project leader is Akka Technologies. There are two academic partners; Akka Technologies and four other companies make up the five industrial partners.[6]
- Academic partners
- Armines and Institut Clément Ader of the École des mines d'Albi-Carmaux are in charge of nondestructive testing.[6][7]
- Laboratoire d'analyse et d'architecture des systèmes (LAAS-CNRS) with the Robotics, Action and Perception (RAP) team handles the autonomous navigation.[6][7][8]
- Industrial partners
- Akka Technologies, particularly the center for research and development Akka Research Toulouse, leads the project and brings skills in image analysis, navigation and aircraft maintenance.[3][6][7][9]
- Airbus Innovations is the initiator of the project, providing CAD models of the Airbus A320 and developing operating scenarios.[3][6][7]
- 2MoRO Solutions, a company based in the French Basque Country, is in charge of the maintenance information system.[6][7]
- M3 Systems, a Toulouse-based company, takes care of the outdoor localization solution based on the Global Positioning System (GPS).[6][7][10]
- Sterela, based in the south of Toulouse, provides the 4MOB mobile platform.[6][7][11]
Project finance
Project finance is provided by the Banque publique d'investissement, the Aquitaine Regional Council, the Pyrénées-Atlantiques Departmental Council, the Midi-Pyrénées Regional Council and the European Union.[12]
Expected benefits
Aircraft are inspected during maintenance operations, either outdoors at an airport between flights or in a hangar for longer inspections. These inspections are conducted mainly by human operators, visually and sometimes with tools to assess defects.[A 1] The project aims to improve the inspection of aircraft and its traceability. A database dedicated to each aircraft type, containing images and three-dimensional scans, will be updated after each maintenance; this allows, for example, the propagation of a crack to be assessed.[4][13]
A human operator's eyes fatigue over time, whereas an automatic solution ensures reliable and repeatable inspections. Reducing inspection time is a major objective for aircraft manufacturers and airlines: faster maintenance operations improve the availability of aircraft and reduce maintenance operating costs.[4][13]
Robot equipment
All electronic equipment is carried by the 4MOB mobile platform manufactured by Sterela. The off-road platform, equipped with four-wheel drive, can move at a speed of 2 metres per second (7.2 km/h; 4.5 mph).[11] Its lithium-ion battery allows an operating time of eight hours. Obstacle-detection bumpers located at the front and at the rear stop the platform if they are compressed.[11]
The cobot weighs 230 kilograms (507 lb). It has two computers, one running Linux for the autonomous navigation module and the other Windows for the non-destructive testing module. The robot is equipped with several sensors. The pan-tilt-zoom camera manufactured by Axis Communications and the Eva 3D scanner manufactured by Artec 3D are dedicated to inspection. The sensors for navigation are an inertial measurement unit; two benches, each equipped with two PointGrey cameras; two Hokuyo laser range finders; and a GPS unit developed by M3 Systems that allows for geofencing tasks in outdoor environments.[3][7]
Autonomous navigation
The autonomous navigation of the Air-Cobot robot takes place in two phases. The first, navigation in the airport or the factory, brings the robot close to the aircraft. The second, navigation around the aircraft, lets the robot position itself at control points referenced in the aircraft's virtual model. In addition, the robot must operate in a dynamic environment where humans and vehicles are moving; to address this, it has an obstacle avoidance module. Many navigation algorithms run on the robot simultaneously under real-time constraints, and research has been conducted on optimizing their computing time.
In an outdoor environment, the robot is able to go to the inspection site by localizing through Global Positioning System (GPS) data. The GPS device developed by M3 Systems allows geofencing. At the airport, the robot operates in dedicated navigation corridors respecting speed limits. Alerts are sent to the operator if the robot enters a prohibited area or exceeds a given speed.[10][A 2]
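The cited sources describe the geofencing behaviour (prohibited-area and speed alerts) but not its implementation. A minimal illustrative sketch in Python, with all names, the corridor polygon and the thresholds being hypothetical rather than M3 Systems' actual logic, could look like:

```python
# Illustrative geofencing check: keep the robot inside an allowed
# corridor (a polygon) and below a speed limit, raising alerts
# otherwise. All names and values are hypothetical.

def point_in_polygon(x, y, polygon):
    """Ray-casting test: is (x, y) inside the polygon given as (x, y) vertices?"""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge crosses the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def geofence_alerts(position, speed, corridor, speed_limit=2.0):
    """Return the alerts for one GPS fix (position in metres, speed in m/s)."""
    alerts = []
    if not point_in_polygon(position[0], position[1], corridor):
        alerts.append("outside authorized corridor")
    if speed > speed_limit:
        alerts.append("speed limit exceeded")
    return alerts

corridor = [(0.0, 0.0), (100.0, 0.0), (100.0, 10.0), (0.0, 10.0)]
print(geofence_alerts((50.0, 5.0), 1.5, corridor))   # inside, within limit
print(geofence_alerts((50.0, 20.0), 2.5, corridor))  # both alerts raised
```

The 2 m/s default matches the platform's maximum speed quoted above; a real system would evaluate such checks on every GPS fix.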
Another algorithm based on computer vision provides real-time lane-marking detection. When visible, painted lanes on the ground provide complementary data to the positioning system for safer trajectories.[A 3] In an indoor environment, or in an outdoor environment where GPS information is unavailable, the cobot can be switched to a follower mode in which it moves behind the human operator and follows him or her to the aircraft to be inspected.[14][A 2]
To perform the inspection, the robot has to navigate around the aircraft and get to the checkpoints called up in the aircraft virtual model. The position of the aircraft in the airport or factory is not known precisely; the cobot needs to detect the aircraft in order to know its position and orientation relative to the aircraft. To do this, the robot is able to locate itself, either with the laser data from its laser range finders,[A 4] or with image data from its cameras.[A 1][A 5]
Near the aircraft, a three-dimensional point cloud is acquired by changing the orientation of the laser scanning sensors mounted on pan-tilt units. After filtering the data to remove the floor and insufficiently large point clusters, a registration technique against the model of the aircraft is used to estimate the static pose of the robot. The robot then moves and maintains this pose estimate using its wheel odometry, its inertial unit and visual odometry.[A 4]
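The filtering step before registration can be sketched as follows. This is an assumption-laden illustration, not the project's code: cluster labels are supposed to come from an upstream segmentation step, and the floor height and minimum cluster size are hypothetical values.

```python
# Illustrative pre-registration filtering of a 3D point cloud:
# drop points near the floor plane and clusters that are too small
# to belong to the aircraft. Thresholds are hypothetical.
from collections import Counter

def filter_cloud(points, labels, floor_z=0.05, min_cluster_size=50):
    """points: list of (x, y, z) in metres; labels: parallel list of cluster ids."""
    # Count how many above-floor points each cluster keeps.
    sizes = Counter(l for p, l in zip(points, labels) if p[2] > floor_z)
    # Keep only above-floor points from sufficiently large clusters.
    return [p for p, l in zip(points, labels)
            if p[2] > floor_z and sizes[l] >= min_cluster_size]
```

A registration algorithm (for example an ICP variant) would then align the surviving clusters with the CAD model of the aircraft.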
Laser data are also used horizontally, in two dimensions. An algorithm provides a real-time estimate of the robot's position when enough elements of the landing gear and engines are visible. A confidence index is calculated from the number of items matched in the laser data; if confidence is high enough, the position is updated. This mode is particularly used when the robot moves beneath the aircraft.[A 4]
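The gating logic described above can be sketched in a few lines. The function names, the pose representation and the 0.6 threshold are hypothetical; only the idea (commit the laser-based estimate when enough expected landmarks were matched) comes from the source.

```python
# Illustrative confidence-gated pose update: the more expected
# landmarks (landing gears, engines) the 2D laser scan matches,
# the higher the confidence in the estimated pose.

def update_pose(current_pose, estimated_pose, matched_landmarks,
                expected_landmarks, threshold=0.6):
    """Return (pose to use, confidence in [0, 1])."""
    confidence = matched_landmarks / expected_landmarks
    if confidence >= threshold:
        return estimated_pose, confidence  # trust the new laser estimate
    return current_pose, confidence        # keep the previous pose
```

With three of four expected landmarks matched the new estimate is accepted; with only one it is rejected and odometry keeps carrying the pose.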
For visual localization, the robot estimates its position relative to the aircraft using visual elements of the aircraft (doors, windows, tires, static ports etc.). As the robot moves, these visual elements are extracted from a three-dimensional virtual model of the aircraft and projected into the image plane of the cameras. The projected shapes are used for pattern recognition to detect those visual elements.[A 5] The other detection method is based on feature extraction with a Speeded Up Robust Features (SURF) approach: a pairing is performed between images of each element to be detected and the scene actually observed.[A 1]
By detecting and tracking visual landmarks, the robot can perform visual servoing in addition to estimating its position relative to the aircraft.[A 6] Research in vision is also conducted on simultaneous localization and mapping (SLAM).[A 7][A 8] A fusion of information between the two acquisition methods, laser and vision, is being considered, as is artificial intelligence to arbitrate between the different localization estimates.[A 4][A 1]
Obstacle avoidance
In both navigation modes, Air-Cobot is also able to detect, track, identify and avoid obstacles in its way. The laser data from the laser range finders and the visual data from the cameras can both be used for the detection, tracking and identification of obstacles. Detection and tracking work better in the two-dimensional laser data, while identification is easier in the camera images; the two methods are complementary. Information from the laser data can be used to delimit work areas in the image.[A 6][A 9][A 10]
The robot has several possible responses to an obstacle, depending on its environment (navigation corridor, tarmac area with few obstacles, cluttered indoor environment etc.) at the time of the encounter. It can stop and wait for the path to clear, avoid the obstacle using a spiral-based technique, or plan a new trajectory around it.[A 6][A 10]
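The spiral-based avoidance can be illustrated by generating waypoints on an Archimedean spiral around the detected obstacle. This is only a geometric sketch under assumed parameters (safe radius, growth rate, angular step); the published controllers (Futterlieb et al. 2014; Leca et al. 2019) are sensor-based and more elaborate.

```python
import math

# Illustrative spiral avoidance path: waypoints on an Archimedean
# spiral centred on a detected obstacle, starting at a safe radius
# and widening as the angle grows. All parameters are hypothetical.

def spiral_waypoints(obstacle_xy, start_radius=1.5, growth=0.1,
                     steps=20, step_angle=math.pi / 8):
    ox, oy = obstacle_xy
    waypoints = []
    for k in range(steps):
        theta = k * step_angle
        r = start_radius + growth * theta  # radius grows with the angle
        waypoints.append((ox + r * math.cos(theta),
                          oy + r * math.sin(theta)))
    return waypoints
```

A trajectory follower would track these waypoints until the obstacle no longer blocks the nominal path, then resume it.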
Computing time optimization
Given the number of navigation algorithms running simultaneously under real-time constraints, research has been conducted on improving the computation time of some numerical methods using field-programmable gate arrays (FPGAs).[A 11][A 12][A 13] The research focused on visual perception. The first part dealt with simultaneous localization and mapping using an extended Kalman filter, which estimates the state of a dynamic system from a series of noisy or incomplete measurements.[A 11][A 13] The second dealt with localization and obstacle detection.[A 12]
Non-destructive testing
Image analysis
After positioning itself for a visual inspection, the robot performs an acquisition with its pan-tilt-zoom camera. Several steps take place: pointing the camera, detecting the element to be inspected, repointing and zooming if needed, image acquisition and inspection. Image analysis is used to determine whether doors are open or closed, whether protective covers are present on certain equipment, the state of turbofan blades and the wear of landing-gear tires.[A 14][A 15][A 16][A 17]
The detection uses pattern recognition of regular shapes (rectangles, circles, ellipses). The 3D model of the element to be inspected can be projected in the image plane for more complex shapes. The evaluation is based on indices such as the uniformity of segmented regions, convexity of their forms, or periodicity of the image pixels' intensity.[A 14]
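One of the evaluation indices named above, region uniformity, can be illustrated very simply. The exact indices used in the project are not published; this variance-based formulation is an assumption chosen for clarity.

```python
# Illustrative uniformity index for a segmented region: 1.0 for a
# perfectly uniform patch of pixel intensities, decreasing toward 0
# as the intensity variance grows. The formulation is hypothetical.

def uniformity(pixels):
    """pixels: iterable of intensity values from one segmented region."""
    pixels = list(pixels)
    mean = sum(pixels) / len(pixels)
    var = sum((p - mean) ** 2 for p in pixels) / len(pixels)
    return 1.0 / (1.0 + var)
```

A covered static port, for instance, would yield a much less uniform region than an uncovered one, so thresholding such an index can separate the two states.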
Feature extraction using speeded up robust features (SURF) can also handle the inspection of certain elements that have two possible states, such as pitot probes or static ports being covered or uncovered. A pairing is performed between images of the element in each state and the element as seen in the scene. For these simple items, analysis during navigation is possible and preferable because it saves time.[A 1][A 18]
Point cloud analysis
After positioning itself for a scan inspection, the robot's pantograph raises the 3D scanner to the fuselage, and a pan-tilt unit moves the scanning device to acquire the hull. By comparing the acquired data to the three-dimensional model of the aircraft, algorithms are able to diagnose faults in the fuselage structure and provide information on their shape, size and depth.[15][A 19][A 20]
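The comparison against the reference model can be illustrated on a one-dimensional profile of deviations from the CAD surface. Everything here (sampling, tolerance, report fields) is a hypothetical simplification of the point-cloud analysis described in the cited work.

```python
# Illustrative defect detection on a scanned surface profile: flag
# each run of samples whose deviation from the CAD surface exceeds a
# tolerance, and report its extent and maximum depth. Values are
# hypothetical.

def find_defects(deviations, tolerance=0.5, spacing_mm=1.0):
    """deviations: per-sample signed distance (mm) from the reference surface."""
    defects, start = [], None
    for i, d in enumerate(deviations + [0.0]):  # sentinel closes a final run
        if abs(d) > tolerance and start is None:
            start = i                            # a defect run begins
        elif abs(d) <= tolerance and start is not None:
            run = deviations[start:i]            # the run just ended
            defects.append({"size_mm": len(run) * spacing_mm,
                            "depth_mm": max(abs(x) for x in run)})
            start = None
    return defects
```

On a real scan the same idea runs in two dimensions over the registered point cloud, producing shape, size and depth for each anomaly.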
By moving the pan-tilt units of the laser range finders, it is also possible to obtain a three-dimensional point cloud. Registration between the model of the aircraft and the scene point cloud is already used in navigation to estimate the static pose of the robot. It is planned to make targeted acquisitions, simpler in terms of movement, to verify the absence of chocks in front of the landing-gear wheels or the proper closing of engine-cowling latches.[A 4]
Human-robot collaboration
As the project name suggests, the mobile robot is a cobot, a collaborative robot. During the navigation and inspection phases, a human operator accompanies the robot; the operator can take control if necessary, add inspection tasks, note a defect that is not on the robot's checklist, or validate the results. In the case of pre-flight inspections, the diagnosis of the walk-around is sent to the pilot, who decides whether or not to take off.[7][14][A 21]
Other robotic inspection solutions
European project Robair
The inspection robot of the European project Robair, funded from 2001 to 2003, is designed to mount on the wings and fuselage of an aircraft to inspect rows of rivets. To move, the robot uses a flexible network of pneumatic suction cups that adapt to the surface. It can inspect lines of rivets using ultrasonic, eddy-current and thermographic techniques, detecting loose rivets and cracks.[16][17][18]
EasyJet drone
Airline EasyJet is interested in inspecting aircraft with drones and performed a first inspection in 2015. Equipped with laser sensors and a high-resolution camera, the drone flies autonomously around the aeroplane, generates a three-dimensional image of the aircraft and transmits it to a technician. The operator can then navigate this representation and zoom in to display a high-resolution picture of parts of the aircraft, and must then visually diagnose the presence or absence of defects. This approach avoids the use of platforms to observe the upper parts of the aeroplane.[19]
Donecle drone
Founded in 2015, Donecle, a Toulouse start-up company, has also launched a drone approach, initially specialized in the detection of lightning strikes on aeroplanes.[20][21] Usually performed by five people equipped with harnesses and platforms, such an inspection takes about eight hours; the immobilization of the aircraft and the staffing are costly for airlines, estimated at $10,000 per hour. The solution proposed by the start-up takes twenty minutes.[21]
Donecle uses a swarm of drones equipped with laser sensors and micro-cameras. The algorithms for automatic detection of defects, trained on an existing image database with machine-learning software, are able to identify various elements: texture irregularities, pitot probes, rivets, openings, text, defects, corrosion and oil stains. A damage report is sent to the operator's tablet, showing each area of interest and its proposed classification with a probability percentage. After reviewing the images, the verdict is pronounced by a qualified inspector.[21]
Project continuation
In 2015, in an interview with the French weekly magazine Air & Cosmos, Jean-Charles Marcos, chief executive officer (CEO) of Akka Research, explained that, once developed and marketed, Air-Cobot should cost between 100,000 and 200,000 euros. It could meet civilian needs in nondestructive testing as well as military ones.[3] A possible continuation of the project could be the use of the robot on aircraft larger than the Airbus A320. The CEO also revealed that Akka Technologies planned to work on a duo of robots for inspection: the same mobile platform for the lower parts and a drone for the upper parts. If funding were allocated, this second phase would take place during 2017–2020.[3]
At the Singapore Airshow in February 2016, Airbus Group presented Air-Cobot and its use in its vision of the hangar of the future.[22] The same month, the Singapore government enlisted Airbus Group to help local maintenance, repair and operations providers stay competitive against cheaper neighbouring countries such as Indonesia, Thailand and the Philippines. To improve productivity, Airbus Group launched, in October 2016, a testbed hangar where new technologies can be tested. Upon entering the hangar, cameras examine the aircraft to detect damage. Mobile robots, such as the one from the Air-Cobot project, and drones, such as the one from the Aircam project, carry out more detailed inspections.[23]
During the 14th International Conference on Remote Engineering and Virtual Instrumentation in March 2017, Akka Research Toulouse, one of Akka Technologies' centers for research and development, presented its vision of the airport of the future.[A 2] Besides Air-Cobot, an earlier step in this research axis was Co-Friend, an intelligent video-surveillance system to monitor and improve airport operations.[A 2][24] Future research will focus on the management of these operations, autonomous vehicles, non-destructive testing and human-machine interactions to increase efficiency and safety at airports.[A 2] Since August 2017, the robot has visited Aeroscopia, an aeronautics museum in Blagnac, once a month. The project researchers take advantage of the collection to test the robot and acquire data on other aircraft models such as the Airbus A400M, Airbus A300 and Sud-Aviation SE 210 Caravelle.[25]
Communications
On 23 October 2014, a patent was filed by Airbus.[26] From 2014 to 2016, the robot was presented at five exhibitions, including the Paris Air Show 2015[1][27][28] and the Singapore Airshow 2016.[22][29] The research developed in the project was presented at eighteen conferences, and twenty-one scientific articles were published: seventeen conference papers and four journal articles.[30] Some of the publications center on the navigation and/or inspection performed by Air-Cobot, while the rest focus on specific numerical methods or hardware solutions related to the project's problems. At the 2016 international conference on Machine Control and Guidance (MCG), the prize for the best final application was awarded to the authors of the publication Human-robot collaboration to perform aircraft inspection in working environment.[31]
On 17 April 2015, Airbus Group posted a project presentation video, made by the communication agency Clipatize, on its YouTube channel.[14][32] On 25 September 2015, Toulouse Métropole broadcast a promotional video on its YouTube channel; it presents the metropolis as an attractive ecosystem able to build the future, highlights its international visibility, and chose the Air-Cobot demonstrator to illustrate the metropolis's robotics research.[33] While the robot was based at the Laboratoire d'analyse et d'architecture des systèmes during development, researchers and engineers working on the project regularly demonstrated it to visitors (external researchers, industrial partners or students); it was also demonstrated to the general public during the 2015 Fête de la science.[34] On 17 February 2016, Airbus Group broadcast a YouTube video presenting its vision of the hangar of the future, in which it plans to use Air-Cobot.[22]
Notes and references
Research publications of the project
1. Villemot, Larnier & Vetault 2016, RFIA
2. Donadio et al. 2017, REV
3. Bauda, Bazot & Larnier 2017, ECMSM
4. Frejaville, Larnier & Vetault 2016, RFIA
5. Jovancevic et al. 2016, ICPRAM
6. Futterlieb, Cadenat & Sentenac 2014, ICINCO
7. Esparza-Jiménez, Devy & Gordillo 2014, FUSION
8. Esparza-Jiménez, Devy & Gordillo 2016, Sensors
9. Lakrouf et al. 2017, ICMRE
10. Leca et al. 2019, ECC
11. Tertei, Piat & Devy 2014, ReConFig
12. Alhamwi, Vandeportaele & Piat 2015, ICVS
13. Tertei, Piat & Devy 2016, CEE
14. Jovancevic et al. 2015, JEI
15. Jovancevic et al. 2015a, QCAV
16. Jovancevic et al. 2015b, CMOI
17. Jovancevic et al. 2016, MECO
18. Leiva et al. 2017, ECMSM
19. Jovancevic et al. 2017, I2M
20. Bauda, Grenwelge & Larnier 2018, ETRSS
21. Donadio et al. 2016, MCG
Proceedings
- Futterlieb, Marcus; Cadenat, Viviane; Sentenac, Thierry (2014). "A navigational framework combining Visual Servoing and spiral obstacle avoidance techniques". Informatics in Control, Automation and Robotics (ICINCO), 2014 11th International Conference on, Vienna: 57–64. https://hal.archives-ouvertes.fr/hal-01354855/document.
- Esparza-Jiménez, Jorge Othón; Devy, Michel; Gordillo, José Luis (2014). "EKF-based SLAM fusing heterogeneous landmarks". 17th International Conference on Information Fusion (FUSION): 1–8. https://hal.archives-ouvertes.fr/hal-01354861/document.
- Tertei, Daniel Törtei; Piat, Jonathan; Devy, Michel (2014). "FPGA design and implementation of a matrix multiplier based accelerator for 3D EKF SLAM". International Conference on ReConFigurable Computing and FPGAs (ReConFig14): 1–6. https://hal.archives-ouvertes.fr/hal-01354873/document.
- Jovancevic, Igor; Orteu, Jean-José; Sentenac, Thierry; Gilblas, Rémi (April 2015a). Meriaudeau, Fabrice; Aubreton, Olivier. eds. "Automated visual inspection of an airplane exterior". Proceedings of SPIE. Twelfth International Conference on Quality Control by Artificial Vision 2015 9534: 95340Y. doi:10.1117/12.2182811. Bibcode: 2015SPIE.9534E..0YJ. https://hal.archives-ouvertes.fr/hal-01351735/document.
- (in French) Jovancevic, Igor; Orteu, Jean-José; Sentenac, Thierry; Gilblas, Rémi (November 2015b). "Inspection d'un aéronef à partir d'un système multi-capteurs porté par un robot mobile". Actes du 14ème Colloque Méthodes et Techniques Optiques pour l'Industrie. https://hal.archives-ouvertes.fr/hal-01350898/document.
- Alhamwi, Ali; Vandeportaele, Bertrand; Piat, Jonathan (2015). "Real Time Vision System for Obstacle Detection and Localization on FPGA". Computer Vision Systems – 10th International Conference, ICVS 2015: 80–90. https://hal.archives-ouvertes.fr/hal-01355008/document.
- Jovancevic, Igor; Viana, Ilisio; Orteu, Jean-José; Sentenac, Thierry; Larnier, Stanislas (February 2016). "Proceedings of the 5th International Conference on Pattern Recognition Applications and Methods". 359–366. doi:10.5220/0005756303590366. ISBN 978-989-758-173-1. https://hal.archives-ouvertes.fr/hal-01353317/file/article_ICPRAM2016.pdf.
- Jovancevic, Igor; Arafat, Al; Orteu, Jean-José; Sentenac, Thierry (2016). "Airplane tire inspection by image processing techniques". 5th Mediterranean Conference on Embedded Computing. https://hal.archives-ouvertes.fr/hal-01351750/document.
- (in French) Frejaville, Jérémy; Larnier, Stanislas; Vetault, Stéphane (2016). "Localisation à partir de données laser d'un robot naviguant autour d'un avion". Actes de la conférence Reconnaissance de Formes et Intelligence Artificielle. https://hal.archives-ouvertes.fr/hal-01333650/document.
- (in French) Villemot, Tanguy; Larnier, Stanislas; Vetault, Stéphane (2016). "Détection d'amers visuels pour la navigation d'un robot autonome autour d'un avion et son inspection". Actes de la conférence Reconnaissance de Formes et Intelligence Artificielle. https://hal.archives-ouvertes.fr/hal-01333651/document.
- Donadio, Frédéric; Frejaville, Jérémy; Larnier, Stanislas; Vetault, Stéphane (2016). "Human-robot collaboration to perform aircraft inspection in working environment". Proceedings of 5th International Conference on Machine Control and Guidance. https://mcg2016.irstea.fr/wp-content/uploads/2017/05/MCG2016_paper_42.pdf.
- Lakrouf, Mustapha; Larnier, Stanislas; Devy, Michel; Achour, Nouara (2017). "Moving Obstacles Detection and Camera Pointing for Mobile Robot Applications". Proceedings of the 3rd International Conference on Mechatronics and Robotics Engineering. pp. 57–62. doi:10.1145/3068796.3068816. ISBN 9781450352802. https://hal.archives-ouvertes.fr/hal-01579420/document.
- Donadio, Frédéric; Frejaville, Jérémy; Larnier, Stanislas; Vetault, Stéphane (2017). "Artificial intelligence and collaborative robot to improve airport operations". Proceedings of 14th International Conference on Remote Engineering and Virtual Instrumentation.
- Bauda, Marie-Anne; Bazot, Cécile; Larnier, Stanislas (2017). "Real-time ground marking analysis for safe trajectories of autonomous mobile robots". 2017 IEEE International Workshop of Electronics, Control, Measurement, Signals and their Application to Mechatronics (ECMSM). pp. 1–6. doi:10.1109/ECMSM.2017.7945887. ISBN 978-1-5090-5582-1.
- Leiva, Javier Ramirez; Villemot, Tanguy; Dangoumeau, Guillaume; Bauda, Marie-Anne; Larnier, Stanislas (2017). "Automatic visual detection and verification of exterior aircraft elements". 2017 IEEE International Workshop of Electronics, Control, Measurement, Signals and their Application to Mechatronics (ECMSM). pp. 1–5. doi:10.1109/ECMSM.2017.7945885. ISBN 978-1-5090-5582-1.
- Bauda, Marie-Anne; Grenwelge, Alex; Larnier, Stanislas (2018). "3D scanner positioning for aircraft surface inspection". Proceedings of European Congress Embedded Real Time Software and Systems. https://www.erts2018.org/uploads/program/ERTS_2018_paper_97.pdf.
- Leca, Dimitri; Cadenat, Viviane; Sentenac, Thierry; Durand-Petiteville, Adrien; Gouaisbaut, Frédéric; Le Flécher, Emile (2019). "Sensor-based Obstacles Avoidance Using Spiral Controllers for an Aircraft Maintenance Inspection Robot". Proceedings of European Control Conference: 2083–2089. https://www.researchgate.net/publication/335198917.
Journal articles
- Jovancevic, Igor; Larnier, Stanislas; Orteu, Jean-José; Sentenac, Thierry (November 2015). "Automated exterior inspection of an aircraft with a pan-tilt-zoom camera mounted on a mobile robot". Journal of Electronic Imaging 24 (6): 061110. doi:10.1117/1.JEI.24.6.061110. Bibcode: 2015JEI....24f1110J. https://hal.archives-ouvertes.fr/hal-01351008/document.
- Esparza-Jiménez, Jorge Othón; Devy, Michel; Gordillo, José Luis (2016). "EKF-based SLAM fusing heterogeneous landmarks". Sensors 16 (4): 489. doi:10.3390/s16040489. PMID 27070602. PMC 4851003. https://hal.archives-ouvertes.fr/hal-01354880/document.
- Tertei, Daniel Törtei; Piat, Jonathan; Devy, Michel (2016). "FPGA design of EKF block accelerator for 3D visual SLAM". Computers and Electrical Engineering. https://hal.archives-ouvertes.fr/hal-01354883/document.
- Jovancevic, Igor; Pham, Huy-Hieu; Orteu, Jean-José; Gilblas, Rémi; Harvent, Jacques; Maurice, Xavier; Brèthes, Ludovic (2017). "Détection et caractérisation de défauts de surface par analyse des nuages de points 3D fournis par un scanner" (in fr). Instrumentation, Mesure, Métrologie, Lavoisier 16: 261–282. https://hal.archives-ouvertes.fr/hal-01660998/document.
PhD thesis reports
- Jovancevic, Igor (2016). Exterior inspection of an aircraft using a Pan-Tilt-Zoom camera and a 3D scanner moved by a mobile robot: 2D image processing and 3D point cloud analysis. École nationale supérieure des mines d'Albi-Carmaux. https://tel.archives-ouvertes.fr/tel-01687831/document.
- Futterlieb, Marcus (2017). Vision based navigation in a dynamic environment. Université Paul Sabatier. https://hal.laas.fr/tel-01624233/document.
Other references
1. (in French) Xavier Martinage (17 June 2015). "Air-Cobot : le robot dont dépendra votre sécurité". La Chaîne Info. http://lci.tf1.fr/economie/entreprise/air-cobot-le-robot-dont-dependra-votre-securite-8622912.html.
2. (in French) "Air-Cobot : un nouveau mode d'inspection visuelle des avions". Les pôles de compétitivité. http://competitivite.gouv.fr/projets-en-cours-fui-investissements-d-avenir/fiche-projet-r-d-aide-355/air-cobot-253.html?cHash=03b3756b6aaf38b6899b6b169842a060.
3. (in French) Olivier Constant (11 September 2015). "Le projet Air-Cobot suit son cours". Air et Cosmos (2487). http://www.pressreader.com/france/air-cosmos/20150911/282003261203383. Retrieved 12 July 2016.
4. (in French) "Rapport d'activité 2013–2014 de l'Aerospace Valley". Aerospace Valley. http://www.aerospace-valley.com/sites/default/files/documents/ra_14_exe_bd.pdf.
5. (in French) "News du projet Air-Cobot". Akka Technologies. https://aircobot.akka.eu/?q=page/news.
6. (in French) "AKKA Technologies coordonne le projet Air-COBOT, un robot autonome d'inspection visuelle des avions". Capital. 1 July 2014. http://www.capital.fr/bourse/communiques/akka-technologies-akka-technologies-coordonne-le-projet-air-cobot-un-robot-autonome-d-inspection-visuelle-des-avions.-945346. Retrieved 14 July 2016.
7. (in French) "Air-Cobot, le robot qui s'assure que vous ferez un bon vol !". Planète Robots (38): 32–33. March–April 2016. https://issuu.com/planeterobots/docs/planete_robots_038-17p.
8. (in French) "Contrats RAP". Laboratoire d'analyse et d'architecture des systèmes. https://www.laas.fr/public/fr/contrats-rap.
9. (in French) "Akka Technologies : une marque employeur orientée sur l'innovation". Le Parisien. 15 February 2016. http://www.leparisien.fr/economie/emploi/top-employeur/akka-technologies-une-marque-employeur-orientee-sur-l-innovation-15-02-2016-5546759.php#xtref=https%3A%2F%2Fwww.google.fr. Retrieved 17 July 2016.
10. "M3 Systems Flagship Solution". M3 Systems. http://jupiter-egnss-its.eu/events-and-activities/company-showcase/m3-systems/.
11. (in French) "4MOB, plateforme intelligente autonome". Sterela Solutions. http://www.sterela.fr/documents/50/4mob_fr.pdf.
12. (in French) "Financeurs". Akka Technologies. https://aircobot.akka.eu/?q=page/financeurs.
13. (in French) Véronique Guillermard (18 May 2015). "Aircobot contrôle les avions avant le décollage". Le Figaro. http://www.lefigaro.fr/societes/2015/06/18/20005-20150618ARTFIG00009-aircobot-controle-les-avions-avant-le-decollage.php. Retrieved 14 July 2016.
14. Air-Cobot on YouTube
15. (in French) Pascal NGuyen (December 2014). "Des robots vérifient l'avion au sol". Sciences et Avenir (814). http://www.sciencesetavenir.fr/high-tech/20141205.OBS7120/des-robots-verifient-l-avion-au-sol.html. Retrieved 17 July 2016.
16. (in French) "Robair, Inspection robotisée des aéronefs". European Commission. http://cordis.europa.eu/result/rcn/85695_fr.html.
17. "Robair". London South Bank University. http://www1.lsbu.ac.uk/esbe/mrndt/robair.shtml.
18. Shang, Jianzhong; Sattar, Tariq; Chen, Shuwo; Bridge, Bryan (2007). "Design of a climbing robot for inspecting aircraft wings and fuselage". Industrial Robot 34 (6): 495–502. doi:10.1108/01439910710832093. http://researchopen.lsbu.ac.uk/2795/1/62.%20CLAWAR%202006-Design%20of%20a%20climbing%20robot.pdf.
19. (in French) Newsroom (8 June 2015). "Easy Jet commence à utiliser des drones pour l'inspection de ses avions". Humanoides. https://humanoides.fr/2015/06/easy-jet-commence-a-utiliser-des-drones-pour-linspection-de-ses-avions/.
20. (in French) Florine Galéron (28 May 2015). "Aéronautique : la startup Donecle invente le drone anti-foudre". Objectif News, la Tribune. http://objectifnews.latribune.fr/innovation/start-up/2015-08-28/aeronautique-la-startup-donecle-invente-le-drone-anti-foudre.html. Retrieved 16 July 2016.
21. (in French) Arnaud Devillard (20 April 2016). "Des drones pour inspecter des avions". Sciences et Avenir. http://www.sciencesetavenir.fr/high-tech/20160420.OBS8889/des-drones-pour-inspecter-des-avions.html. Retrieved 16 July 2016.
22. Innovations in Singapore: the Hangar of the Future on YouTube
23. "Pimp my Hangar: Excelling in MRO". Airbus. http://www.airbusgroup.com/int/en/news-media/corporate-magazine/Forum-89/Hangar-of-the-future.html.
24. (in French) Éric Parisot (21 June 2013). "Co-Friend, le système d'analyse d'images qui réduit les temps d'immobilisation des avions". Usine Digitale. https://www.usine-digitale.fr/article/co-friend-le-systeme-d-analyse-d-images-qui-reduit-les-temps-d-immobilisation-des-avions.N199908. Retrieved 24 February 2018.
25. (in French) Aeroscopia, ed (August 2017). "Le Musée accueille le projet AIR-COBOT". http://www.musee-aeroscopia.fr/fr/actualites/le-mus%C3%A9e-accueille-le-projet-air-cobot.
26. "Espacenet – Bibliographic data – Collaborative robot for visually inspecting an aircraft". worldwide.espacenet.com. http://worldwide.espacenet.com/publicationDetails/biblio?CC=WO&NR=2015059241&KC=&locale=en_EP&FT=E.
27. (in French) Juliette Raynal; Jean-François Prevéraud (15 June 2015). "Bourget 2015 : les dix rendez-vous technos à ne pas louper". Industrie et Technologies. http://www.industrie-techno.com/bourget-2015-les-dix-rendez-vous-technos-a-ne-pas-louper.38838. Retrieved 16 July 2016.
28. (in French) "Akka Technologies au Salon du Bourget". Maurice Ricci. 21 June 2015. http://www.mauricericci.com/akka-technologies-au-salon-du-bourget/.
29. "Singapore Airshow 2016 Trends: Emerging Technologies Take Off – APEX | Airline Passenger Experience". apex.aero. http://apex.aero/2016/02/24/singapore-airshow-2016-trends-emerging-technologies.
30. (in French) "Communications du projet Air-Cobot". Akka Technologies. https://aircobot.akka.eu/?q=page/communications.
31. "Best MCG2016 Final Application Award". Machine Control and Guidance. October 2016. https://mcg2016.irstea.fr/wp-content/uploads/2017/05/MCG2016_bestmcg2016finalapplicationpaper.pdf.
32. "AirCobot – Introducing Smart Robots for Aircraft Inspections". Clipatize. http://www.clipatize.com/case-study-folder/airbus_aircobot_3d_video_case_study/.
33. (in French) Toulouse métropole, construire le futur on YouTube
34. (in French) "Air-Cobot, le robot d'assistance aux inspections des aéronefs". Programme de la fête de la science. 2015. https://www.laas.fr/public/sites/www.laas.fr.public/files/reserved/comm/pdf/LivretFDS2015_light.pdf. Retrieved 17 July 2016.
Original source: https://en.wikipedia.org/wiki/Air-Cobot.