Generate Scenario from Actor Track List and GPS Data
This example shows how to generate a scenario containing actor trajectories by using data extracted from a Global Positioning System (GPS) and an actor track list.
Generating scenarios from recorded sensor data enables you to create scenarios that mimic real-world actor behaviors, such as a front vehicle cut-in for adaptive cruise control (ACC). These scenarios, generated from real-world sensor data, can improve the test coverage of automated driving systems because, unlike manually created scenarios, they are scalable and less prone to human error. This example shows how to automatically create a scenario from recorded sensor data.
This figure shows these steps:
1. Smooth the GPS data and format the actor track list.
2. Reconstruct the ego vehicle trajectory.
3. Extract the ego vehicle driven roads from the map.
4. Extract non-ego actor properties from the track list.
5. Build the scenario with the roads, ego vehicle, and non-ego actors.
Load Sensor Data
This example requires the Scenario Builder for Automated Driving Toolbox™ support package. Check if the support package is installed and, if it is not, install it by using the Get and Manage Add-Ons tool.
checkIfScenarioBuilderIsInstalled
Download a ZIP file containing a subset of sensor data from the PandaSet data set, and then unzip the file. This file contains GPS data, an actor track list, and camera information. In this example, you use the camera data for visual validation of the generated scenario.
dataFolder = tempdir;
dataFilename = "PandasetSensorData_23a.zip";
url = "https://ssd.mathworks.com/supportfiles/driving/data/"+dataFilename;
filePath = fullfile(dataFolder,dataFilename);
if ~isfile(filePath)
    websave(filePath,url);
end
unzip(filePath,dataFolder)
dataset = fullfile(dataFolder,"PandasetSensorData");
data = load(fullfile(dataset,"sensorData.mat"));
Load the GPS data into the workspace.
gpsData = data.GPSData;
gpsData is a table with these columns:
- timeStamp — Time, in seconds, at which the GPS data was collected.
- latitude — Latitude coordinate value of the ego vehicle. Units are in degrees.
- longitude — Longitude coordinate value of the ego vehicle. Units are in degrees.
- altitude — Altitude coordinate value of the ego vehicle. Units are in meters.
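Before using the GPS data, you can optionally sanity-check it. This sketch, which is not part of the original workflow, assumes gpsData is already loaded; it verifies that the timestamps increase monotonically and estimates the sampling rate from the median timestamp spacing.

```matlab
% Optional sanity checks on the GPS table (sketch; assumes gpsData is loaded)
assert(issorted(gpsData.timeStamp),"GPS timestamps must be nondecreasing")
% Approximate sampling rate in Hz from the median timestamp spacing
approxRate = 1/median(diff(gpsData.timeStamp));
fprintf("GPS sampled at roughly %.1f Hz over %.1f s\n", ...
    approxRate,gpsData.timeStamp(end)-gpsData.timeStamp(1))
```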
Display the first five entries of gpsData.
gpsData(1:5,:)
ans=5×4 table
timeStamp latitude longitude altitude
__________ ________ _________ ________
1.5576e+09 37.374 -122.06 42.858
1.5576e+09 37.374 -122.06 42.858
1.5576e+09 37.374 -122.06 42.854
1.5576e+09 37.374 -122.06 42.849
1.5576e+09 37.374 -122.06 42.848
Load the actor track list data. Alternatively, you can generate an actor track list by processing raw camera or lidar sensor data. For more information on how to generate an actor track list from camera data, see the Extract Vehicle Track List from Recorded Camera Data for Scenario Generation example. For more information on how to generate a track list from lidar data, see the Extract Vehicle Track List from Recorded Lidar Data for Scenario Generation example.
% Use the helperExtractTracklist function to extract the actor track list from the data.
tracklist = helperExtractTracklist(data)
tracklist = 
  actorTracklist with properties:

         TimeStamp: [400×1 double]
          TrackIDs: {400×1 cell}
          ClassIDs: {400×1 cell}
          Position: {400×1 cell}
         Dimension: {400×1 cell}
       Orientation: {400×1 cell}
          Velocity: []
             Speed: []
         StartTime: 1.5576e+09
           EndTime: 1.5576e+09
        NumSamples: 400
    UniqueTrackIDs: [20×1 string]
tracklist is an actorTracklist object containing the track list data.
Read the first few rows of the data and display the output.
sampleData = readData(tracklist, RowIndices = (1:5)')
sampleData=5×2 table
TimeStamp ActorInfo
__________ ____________
1.5576e+09 {6×1 struct}
1.5576e+09 {6×1 struct}
1.5576e+09 {6×1 struct}
1.5576e+09 {6×1 struct}
1.5576e+09 {6×1 struct}
Display ActorInfo for the first time stamp.
sampleData.ActorInfo{1,1}
ans=6×1 struct array with fields:
TrackID
ClassID
Position
Dimension
Yaw
Pitch
Roll
Speed
Velocity
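As an illustration of how you can work with this struct array (a sketch, not part of the original workflow), you can concatenate fields across all actors detected at a time stamp:

```matlab
% Collect per-actor fields from the first time stamp (sketch; assumes
% sampleData exists from the readData call above)
info = sampleData.ActorInfo{1,1};    % 6-by-1 struct array
positions = vertcat(info.Position);  % N-by-3 matrix, one [x y z] row per actor
yawAngles = vertcat(info.Yaw);       % N-by-1 vector of yaw angles
```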
Load the camera data recorded from a forward-facing monocular camera mounted on the ego vehicle.
cameraData = data.CameraData;
The camera data is a table with two columns:
- timeStamp — Time, in seconds, at which the image data was captured.
- fileName — Filenames of the images in the data set.
The images are located in the Camera folder in the dataset directory. Create a table that contains the file paths of these images for each timestamp by using the helperUpdateTable function.
imageFolder = "Camera";
cameraData = helperUpdateTable(cameraData,dataset,imageFolder);
Display the first five entries of cameraData.
cameraData(1:5,:)
ans=5×2 table
timeStamp filePath
__________ _____________________________________________
1.5576e+09 {["/tmp/PandasetSensorData/Camera/0001.jpg"]}
1.5576e+09 {["/tmp/PandasetSensorData/Camera/0002.jpg"]}
1.5576e+09 {["/tmp/PandasetSensorData/Camera/0003.jpg"]}
1.5576e+09 {["/tmp/PandasetSensorData/Camera/0004.jpg"]}
1.5576e+09 {["/tmp/PandasetSensorData/Camera/0005.jpg"]}
Remove data from the workspace to save memory.
clear data
Crop and Preprocess Sensor Data
Crop the GPS, actor track list, and camera data relative to the GPS timestamp range by using the helperCropData function.
startTime = gpsData.timeStamp(1);
endTime = gpsData.timeStamp(end);
% Pack all the tables in a cell array.
recordedData = {gpsData,tracklist,cameraData};
% Crop the data.
recordedData = helperCropData(recordedData,startTime,endTime);
The timestamp values of the recorded data set are in the POSIX® format, which Scenario Builder for Automated Driving Toolbox™ supports. Use the helperNormTimeInSecs function to normalize the timestamps using these arguments:
- scale — Scale by which to convert the timestamps. Because the recorded timestamps are already in seconds, specify this argument as 1.
- offset — Offset of the simulation start time. Specify the start time as the first timestamp in gpsData.
scale = 1;
offset = startTime;
recordedData = helperNormTimeInSecs(recordedData,offset,scale);
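The normalization maps each POSIX timestamp t to (t − offset)/scale, so the simulation clock starts at 0 seconds. As an aside, not part of the original example, you can compute the same values for the GPS timestamps directly and inspect the recording start time as a calendar date by using the datetime function.

```matlab
% Equivalent normalization for the GPS timestamps alone (sketch)
normalizedTime = (gpsData.timeStamp - offset)/scale;  % starts at 0 s
% View the recording start time as a readable date
recordingStart = datetime(offset,ConvertFrom="posixtime",TimeZone="UTC")
```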
Extract the GPS data, actor track list, and camera data with updated timestamp values from recordedData.
gpsData = recordedData{1,1};
tracklist = recordedData{1,2};
cameraData = recordedData{1,3};
Remove recordedData from the workspace.
clear recordedData
Extract Map Roads Using GPS Data
Create a geographic player by using a geoplayer object, and display the full route by using the GPS data.
zoomLevel = 16;
center = mean([gpsData.latitude gpsData.longitude]);
player = geoplayer(center(1),center(2),zoomLevel);
plotRoute(player,gpsData.latitude,gpsData.longitude)
Obtain geographic bounding box coordinates from the GPS data by using the getMapROI function.
mapStruct = getMapROI(gpsData.latitude,gpsData.longitude);
The map file required for importing roads of the specified area is downloaded from the OpenStreetMap® (OSM) website. OpenStreetMap provides access to worldwide, crowd-sourced, map data. The data is licensed under the Open Data Commons Open Database License (ODbL). For more information on the ODbL, see the Open Data Commons Open Database License site.
url = mapStruct.osmUrl;
filename = "drive_map.osm";
websave(filename,url,weboptions(ContentType="xml"));
Extract road properties and geographic reference coordinates, which you use to identify the ego roads, by using the roadprops function.
[roadData,geoReference] = roadprops("OpenStreetMap",filename);
Select Ego Roads from Road Network
Convert the geographic GPS coordinates to local east-north-up (ENU) coordinates by using the latlon2local function. The transformed coordinates define the trajectory waypoints of the ego vehicle. Units are in meters.
[xEast,yNorth,zUp] = latlon2local(gpsData.latitude,gpsData.longitude,gpsData.altitude,geoReference);
waypoints = [xEast,yNorth,zUp];
Raw GPS data often contains noise. Smooth the GPS waypoints by using the smoothdata function.
window = round(size(waypoints,1)*0.2);
waypoints = smoothdata(waypoints,"rloess",window);
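To visually confirm the effect of smoothing, you can recompute the raw ENU waypoints and overlay both trajectories. This is an optional check, not part of the original example; it assumes xEast, yNorth, and zUp are still in the workspace from the latlon2local call.

```matlab
% Compare raw and smoothed waypoints (sketch)
rawWaypoints = [xEast,yNorth,zUp];
figure
plot(rawWaypoints(:,1),rawWaypoints(:,2),".",DisplayName="Raw GPS")
hold on
plot(waypoints(:,1),waypoints(:,2),LineWidth=1.5,DisplayName="Smoothed")
xlabel("East (m)"); ylabel("North (m)"); legend; axis equal
```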
If GPS data suffers from inaccuracies in position and orientation, then you must improve your ego vehicle localization to generate an accurate ego trajectory. For more information, see the Ego Vehicle Localization Using GPS and IMU Fusion for Scenario Generation example.
Create the ego trajectory from the waypoints and their corresponding times of arrival by using the waypointTrajectory (Sensor Fusion and Tracking Toolbox) System object™. For this example, assume that the vehicle does not leave the ground at any of the waypoints, and set all altitude values to 0. You must set the ReferenceFrame property of this System object to "ENU", because Scenario Builder for Automated Driving Toolbox™ supports only the ENU format for local coordinate data.
waypoints = double([waypoints(:,1:2) zeros(size(zUp))]);
egoTrajectory = waypointTrajectory(waypoints,gpsData.timeStamp,ReferenceFrame="ENU");
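You can query the resulting trajectory at arbitrary times by using the lookupPose object function of waypointTrajectory. This sketch, not in the original example, samples ten poses evenly spaced across the recording:

```matlab
% Sample the ego pose at ten evenly spaced times (sketch)
tQuery = linspace(gpsData.timeStamp(1),gpsData.timeStamp(end),10)';
[egoPos,egoOrient,egoVel] = lookupPose(egoTrajectory,tQuery);
```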
Extract the road properties for the roads on which the ego vehicle travels by using the selectActorRoads function.
egoRoadData = selectActorRoads(roadData,egoTrajectory.Waypoints);
Display the first five entries of the egoRoadData table.
egoRoadData(1:5,:)
ans=5×10 table
RoadID JunctionID RoadName RoadCenters RoadWidth BankAngle Heading Lanes LeftBoundary RightBoundary
______ __________ _____________________ _____________ _________ ____________ ____________ ____________ _____________ _____________
41 0 "West El Camino Real" { 2×3 double} 14.55 {2×1 double} {2×1 double} 1×1 lanespec {22×3 double} {22×3 double}
42 0 "West El Camino Real" { 2×3 double} 14.55 {2×1 double} {2×1 double} 1×1 lanespec {13×3 double} {13×3 double}
44 0 "West El Camino Real" {24×3 double} 14.55 {3×1 double} {3×1 double} 1×1 lanespec {27×3 double} {27×3 double}
47 0 "West El Camino Real" { 2×3 double} 14.55 {2×1 double} {2×1 double} 1×1 lanespec {22×3 double} {22×3 double}
48 0 "West El Camino Real" { 3×3 double} 14.55 {3×1 double} {3×1 double} 1×1 lanespec {31×3 double} {31×3 double}
Visualize the selected roads by using the helperPlotRoads function. Notice that the selected ego roads do not have lane information. Define a crop window of the form [x y width height] to crop, zoom in on, and display the map. The x and y elements specify the coordinates of the top-left corner of the crop window.
cropWindow = [-30 -30 60 60];
helperPlotRoads(egoRoadData,cropWindow);
Extract Non-Ego Actor Properties
Visualize the actor track list and camera images by using the birdsEyePlot and helperPlotActors functions.
% Initialize the figure with a bird's-eye plot.
currentFigure = figure(Visible="on",Position=[0 0 1400 600]);
hPlot = axes(uipanel(currentFigure,Position=[0 0 0.5 1],Title="Non-Ego Actors"));
bep = birdsEyePlot(XLim=[0 70],YLim=[-35 35],Parent=hPlot);
camPlot = axes(uipanel(currentFigure,Position=[0.5 0 0.5 1],Title="Camera View"));
helperPlotActors(bep,camPlot,tracklist,cameraData)
Extract actor properties, such as entry time, exit time, and dimension, from the track list data by using the actorprops function. The function uses the extracted ego trajectory information to return the non-ego actor properties in the world frame. Data from sensors is often noisy, which results in inaccurate waypoints. Remove noise from the non-ego actor waypoints by using the helperSmoothWaypoints function.
nonEgoActorInfo = actorprops(tracklist,egoTrajectory,SmoothWaypoints=@helperSmoothWaypoints,SaveAs="none");
Display the first five entries of nonEgoActorInfo.
nonEgoActorInfo(1:5,:)
ans=5×14 table
Age TrackID ClassID EntryTime ExitTime Dimension Mesh Time Waypoints Speed Roll Pitch Yaw IsStationary
___ _______ _______ _________ ________ _______________________ ______________________ ______________ ______________ ______________ ______________ ______________ ______________ ____________
10 "2" 1 0 0.89983 2.037 5.273 1.825 1×1 extendedObjectMesh { 10×1 double} { 10×3 double} { 10×1 double} { 10×1 double} { 10×1 double} { 10×1 double} true
400 "3" 1 0 39.9 1.911 4.672 1.527 1×1 extendedObjectMesh {400×1 double} {400×3 double} {400×1 double} {400×1 double} {400×1 double} {400×1 double} false
10 "4" 1 0 0.89983 2.043 4.537 1.87 1×1 extendedObjectMesh { 10×1 double} { 10×3 double} { 10×1 double} { 10×1 double} { 10×1 double} { 10×1 double} false
139 "5" 1 0 13.799 2.199 4.827 1.968 1×1 extendedObjectMesh {139×1 double} {139×3 double} {139×1 double} {139×1 double} {139×1 double} {139×1 double} false
400 "6" 1 0 39.9 1.981 4.974 1.58 1×1 extendedObjectMesh {400×1 double} {400×3 double} {400×1 double} {400×1 double} {400×1 double} {400×1 double} false
Simulate Scenario with Ego and Non-Ego Actors
Define parameters to create a driving scenario. Specify endTime as the final timestamp in the GPS data, and calculate sampleTime as the minimum difference between two consecutive GPS timestamps.
endTime = gpsData.timeStamp(end);
sampleTime = min(diff(gpsData.timeStamp));
Create a driving scenario by using the drivingScenario object. Specify the SampleTime and StopTime properties of the driving scenario.
scenario = drivingScenario(SampleTime=sampleTime,StopTime=endTime);
Roads extracted from OpenStreetMap do not contain lane information. Create reference lane specifications by using the lanespec function. Specify three lanes, each with an approximate width of 4 meters, based on a visual inspection of the camera data. Update egoRoadData with the lane information.
lanes = lanespec(3,Width=4);
egoRoadData.Lanes(:,1) = lanes;
Add roads with the updated lane specifications by using the helperAddRoads function. This helper function adds roads to the driving scenario by using the road function.
scenario = helperAddRoads(scenario,egoRoadData);
Add the ego vehicle waypoints to the scenario by using the helperAddEgo function.
scenario = helperAddEgo(scenario,egoTrajectory);
Add the non-ego actors and their trajectories to the scenario by using the helperAddNonEgo function.
scenario = helperAddNonEgo(scenario,nonEgoActorInfo);
Visualize the generated scenario and compare it with the recorded camera data by using the helperViewScenario function.
currentFigure = figure(Name="Generated Scenario",Position=[0 0 700 500]);
helperViewScenario(currentFigure,scenario,cameraData)
Export Scenario to ASAM OpenSCENARIO
Export the generated scenario to the ASAM OpenSCENARIO 1.0 file format by using the export function.
fileName = "example_scenario.xosc";
export(scenario,"OpenSCENARIO",fileName);
You can view the exported scenario in external simulators, such as esmini.
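For example, if you have the esmini viewer installed and on your system path, a command like this sketch plays the exported file. The window geometry values are illustrative; esmini's --osc and --window options select the scenario file and viewer window.

```shell
# Play the exported scenario in the esmini viewer (assumes esmini is installed)
esmini --osc example_scenario.xosc --window 60 60 800 400
```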
See Also
Functions
Related Topics
- Overview of Scenario Generation from Recorded Sensor Data
- Smooth GPS Waypoints for Ego Localization
- Ego Vehicle Localization Using GPS and IMU Fusion for Scenario Generation
- Extract Lane Information from Recorded Camera Data for Scene Generation
- Generate RoadRunner Scene from Recorded Lidar Data
- Generate High Definition Scene from Lane Detections and OpenStreetMap
- Generate RoadRunner Scenario from Recorded Sensor Data