Developing, Integrating & Testing Deep Learning-based Perception for Autonomous Vehicles using MATLAB / Simulink and Unreal Engine

2024-26-0086

01/16/2024

Event
Symposium on International Automotive Technology
Abstract: The motivation for this project is to apply three decades of automotive experience to develop and validate a workflow and technologies for self-driving vehicles, addressing the challenges faced by human drivers for the benefit of the community and society we live in. Traffic safety data show that the major cause of road accidents is human error by the person driving the vehicle, such as decision, perception, and performance errors. Other causes are vehicle failures and environmental conditions. These are the problems this project addresses so that driving becomes safe, accident free, and an enjoyable experience.

A human driver performs two major control tasks while driving. The first is lateral control, steering the vehicle left or right. The second is longitudinal control, braking or accelerating to move slower or faster. The driver performs these actions in response to detected objects and events, route planning, and so on, and also handles other tasks such as activating or deactivating the wipers or reacting to honking.

The approach followed in the project is:
1) Perception model: design and develop a virtual 3D environment (lane markings, road signs, vegetation, buildings, signals, etc.) with roads, sensors (camera, lidar), and actors (pedestrians, ego vehicle, other cars).
2) Sensor fusion: combine data from multiple sensors, using multiple algorithms, into a single unified view of the scene as perceived by the vehicle.
3) Planning: issue instructions/signals to steer (left/right turn) or to accelerate/decelerate the vehicle.
4) Control: determine how much to steer, brake, or accelerate (vehicle dynamics) and then execute the command.

The project involved two perception tasks: a) lane detection and b) vehicle detection. Multiple technologies are used:
a) RoadRunner for creating the scene and environment
b) Driving Scenario Designer in MATLAB for actors and sensors
c) Simulink and Unreal Engine for co-simulation
d) MATLAB for data labeling
e) MATLAB for training data preparation
f) transfer learning (deep learning) to train the model and detector
g) a CNN-based YOLO v2 object detector as the detection algorithm (an illustrative training sketch is given after this abstract)
h) MATLAB to test the detector and measure its performance within the integrated MATLAB/Simulink simulation model, followed by further improvement.

The final result is that the detector in simulation detects cars/vehicles on the road, achieving a performance of 77%, which is reasonably good. Further work continues under my PhD to cover the various levels of driving automation defined by SAE.

Keywords: Autonomous Vehicle, Self-driving, Sensors, Perception, Control, Deep Learning, MATLAB, Unreal Engine, Simulink
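As an illustration of the detector-training and testing steps described above, the sketch below shows how a YOLO v2 vehicle detector can be assembled and trained by transfer learning in MATLAB (Computer Vision Toolbox / Deep Learning Toolbox). This is a minimal sketch under stated assumptions, not the paper's exact setup: the dataset table name (vehicleDataset), the ResNet-50 backbone and feature layer, the anchor boxes, the train/test split, and the training hyperparameters are all illustrative placeholders.

% Minimal sketch: transfer-learning a YOLO v2 vehicle detector in MATLAB.
% Assumption: vehicleDataset is a table with columns
%   imageFilename | vehicle (M-by-4 bounding boxes per image).
% Backbone, feature layer, anchors, and hyperparameters are placeholders.

inputSize   = [224 224 3];                    % network input size
numClasses  = 1;                              % single class: vehicle
anchorBoxes = [43 59; 18 22; 23 29; 84 109];  % illustrative anchor boxes

% Pretrained ResNet-50 as the feature extractor (transfer learning)
featureNetwork = resnet50;
featureLayer   = 'activation_40_relu';
lgraph = yolov2Layers(inputSize, numClasses, anchorBoxes, ...
                      featureNetwork, featureLayer);

% Split the labeled data into training and test sets (80/20, illustrative)
rng(0);
shuffled  = vehicleDataset(randperm(height(vehicleDataset)), :);
idx       = floor(0.8 * height(shuffled));
trainData = shuffled(1:idx, :);
testData  = shuffled(idx+1:end, :);

options = trainingOptions('sgdm', ...
    'MiniBatchSize', 16, ...
    'InitialLearnRate', 1e-3, ...
    'MaxEpochs', 20, ...
    'Shuffle', 'every-epoch');

% Train the detector, then measure average precision on the test set
[detector, info] = trainYOLOv2ObjectDetector(trainData, lgraph, options);

results = detect(detector, imageDatastore(testData.imageFilename));
ap = evaluateDetectionPrecision(results, testData(:, 'vehicle'));
fprintf('Vehicle detection average precision: %.2f\n', ap);

The reported 77% figure in the abstract is a detection-performance result of this general kind; the evaluation call above simply shows where such a number would be computed in a MATLAB workflow.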
Citation
Sah, R., "Developing, Integrating & Testing Deep Learning-based Perception for Autonomous Vehicles using MATLAB / Simulink and Unreal Engine," SAE Technical Paper 2024-26-0086, 2024, .
Additional Details
Publisher
SAE International
Published
Jan 16, 2024
Product Code
2024-26-0086
Content Type
Technical Paper
Language
English