
Telfor Journal


Contents

2020, vol. 12, no. 1, pp. 40-45
Development of ADAS perception applications in ROS and "Software-In-the-Loop" validation with CARLA simulator
(title not available in Serbian)
a RT-RK Institute for Computer Based Systems, Novi Sad
b University of Novi Sad, Faculty of Technical Sciences

e-mail: stevan.stevic@rt-rk.com, momcilo.krunic@rt-rk.com, marko.dragojevic@rt-rk.com, nives.kaprocki@rt-rk.com
Project:
Development of digital technologies and networked services in systems with embedded electronic components (MPNTR - 44009)

Keywords: autonomous driving; perception; ROS; CARLA; AUTOWARE; SIL; ADAS; C++; Python
Abstract
(not available in Serbian)
Higher levels of autonomous driving bring sophisticated requirements and unforeseen challenges. To address them, the set of functionalities in modern vehicles is growing in terms of algorithmic complexity and required hardware. Testing implemented solutions in the real world is risky, expensive, and time-consuming, which is why virtual automotive simulation tools for testing are widely adopted. Original Equipment Manufacturers (OEMs) use these tools to create a closed sense-compute-act loop and obtain realistic testing scenarios. Production software is tested against simulated sensing data; based on these inputs, a set of actions is produced and simulated, and the resulting consequences are evaluated. This allows OEMs to minimize design errors and optimize vehicle production costs before any physical prototype is built. This paper presents the development of simple C++/Python perception applications that can be used in driver assistance functionalities. Using ROS as a prototyping platform, these applications are validated and tested with the "Software-In-the-Loop" (SIL) method. The CARLA simulator is used as a generator of input data, and the output commands of the autonomous platform are executed as simulated actions within the simulator. Validation is done by connecting the Autoware autonomous platform with the CARLA simulator in order to test against various scenes in which the applications are applicable. Vision-based lane detection, one of the prototypes, is also tested in a real-world scenario to demonstrate that algorithms developed with simulators are applicable to real-time processing.
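
To illustrate the SIL setup the abstract describes, the sketch below shows a minimal ROS node (Python) that subscribes to a camera stream published by the CARLA ROS bridge and runs a basic vision-based lane-detection step (Canny edges plus a probabilistic Hough transform). This is an illustrative assumption of how such a prototype could look, not the authors' implementation: the topic names and the Canny/Hough parameters are placeholders and depend on the carla-ros-bridge configuration actually used.

#!/usr/bin/env python
# Minimal sketch: ROS node consuming a CARLA ROS bridge camera topic and
# publishing a lane-annotated image. Topic names and thresholds are
# illustrative assumptions, not the paper's implementation.
import rospy
import cv2
import numpy as np
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

# Assumed topic published by carla-ros-bridge for the ego vehicle's front camera.
CAMERA_TOPIC = "/carla/ego_vehicle/camera/rgb/front/image_color"

bridge = CvBridge()
pub = None  # publisher for the annotated image, created in main()

def on_image(msg):
    # Convert the ROS image message to an OpenCV BGR frame.
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")

    # Simple perception step: grayscale -> Canny edges -> Hough line segments.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)

    # Keep only the lower half of the image, where lane markings appear.
    mask = np.zeros_like(edges)
    mask[edges.shape[0] // 2:, :] = 255
    edges = cv2.bitwise_and(edges, mask)

    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                            threshold=40, minLineLength=30, maxLineGap=100)
    if lines is not None:
        for x1, y1, x2, y2 in lines[:, 0]:
            cv2.line(frame, (int(x1), int(y1)), (int(x2), int(y2)), (0, 255, 0), 2)

    # Publish the annotated frame so it can be inspected in rviz or rqt_image_view.
    pub.publish(bridge.cv2_to_imgmsg(frame, encoding="bgr8"))

def main():
    global pub
    rospy.init_node("lane_detection_sil_demo")
    pub = rospy.Publisher("/lane_detection/image_annotated", Image, queue_size=1)
    rospy.Subscriber(CAMERA_TOPIC, Image, on_image, queue_size=1)
    rospy.spin()

if __name__ == "__main__":
    main()

In a SIL run of this kind, CARLA renders the scene and the bridge turns the rendered camera frames into ROS messages, so the same node can later be pointed at a real camera driver topic without code changes, which is the reuse path the abstract refers to for the real-world lane-detection test.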
References
*** LGSVL simulator. https://www.lgsvlsimulator.com/ [September 2019]
*** Unreal engine 4. https://docs.unrealengine.com/en-US/index.html [September 2019]
*** (2019) SAE levels of driving automation. https://www.sae.org/news/2019/01/sae-updates-j3016-automated-driving-graphic [September 2019]
*** ROS bridge for CARLA simulator. https://github.com/carla-simulator/ros-bridge [September 2019]
*** (2019) Waymo simulated kilometres. https://techcrunch.com/2019/07/10/waymo-has-now-driven-10-billion-autonomous-miles-in-simulation/ [September 2019]
*** Autoware.AI. https://github.com/Autoware-AI/autoware.ai [July 2020]
*** OpenDRIVE. http://www.opendrive.org/ [September 2019]
*** rFpro. http://www.rfpro.com/ [September 2019]
*** Autoware in Carla. https://github.com/carla-simulator/carla-autoware [September 2019]
*** Lane finding project. https://github.com/stevanStevic/Advanced-Lane-Lines-Finding [April 2020]
*** Nvidia drive constellation. https://www.nvidia.com/en-gb/self-driving-cars/drive-constellation/ [September 2019]
*** Gazebo simulator. http://gazebosim.org/ [April 2020]
*** Robot operating system. https://www.ros.org/ [September 2019]
Behere, S., Törngren, M. (2015) A functional architecture for autonomous driving. In: First International Workshop on Automotive Software Architecture (WASA), Montreal, QC, 3-10
Dosovitskiy, A., Ros, G., Codevilla, F., López, A., Koltun, V. (2017) CARLA: An open urban driving simulator. CoRL
Elfes, A. (1989) Using occupancy grids for mobile robot perception and navigation. Computer, vol. 22, no. 6, pp. 46-57, June
Fang, J., Zhou, D., Yan, F., Zhao, T., Zhang, F., Ma, Y., Wang, L., Yang, R. (2018) Augmented LiDAR simulator for autonomous driving
Jeong, S., Kwak, Y., Woo, J.L. (2016) Software-in-the-Loop simulation for early-stage testing of AUTOSAR software component. In: Eighth International Conference on Ubiquitous and Future Networks (ICUFN), Vienna, 59-63
Jin, F., Feilong, Y., Tongtong, Z., Feihu, Z., Dingfu, Z., Ruigang, Y., Yu, M., Liang, W. (2018) Simulating LIDAR point cloud for autonomous driving using real world scenes and traffic flows
Kalra, N., Paddock, S.M. (2016) Driving to safety: How many miles of driving would it take to demonstrate autonomous vehicle reliability?. Santa Monica, CA: RAND Corporation
Matthaei, R., Maurer, M. (2015) Autonomous driving: A top-down-approach. Automatisierungstechnik, 63(3)
Stević, S., Krunić, M., Dragojević, M., Kaprocki, N. (2019) Development and validation of ADAS perception application in ROS environment integrated with CARLA simulator. In: 27th Telecommunications Forum (TELFOR), Belgrade, Serbia, 1-4
 

About the article

article language: English
article type: unclassified
DOI: 10.5937/telfor2001040S
received: 01.05.2020.
revised: 17.07.2020.
accepted: 18.07.2020.
published: 31.07.2020.
published in SCIndeks: 09.10.2020.

Related articles

No related articles