Following on from my earlier analysis of UAV-LiDAR data with the forest tree metrics extraction script I am tweaking, I have run into some memory trouble. I have been reading a lot of documentation, but I cannot seem to wrap my head around this.
label2D = helperSegmentTrees(canopyModel,treeTopRowId,treeTopColId,minTreeHeight);
rowId = ceil((ptCloud.Location(:,2) - ptCloud.YLimits(1))/gridRes) + 1;
colId = ceil((ptCloud.Location(:,1) - ptCloud.XLimits(1))/gridRes) + 1;
ind = sub2ind(size(label2D),rowId,colId);
label3D = label2D(ind);   % map each point to its 2-D segment label
validSegIds = label3D ~= 0;
ptVeg = select(ptCloud,validSegIds);
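For context, my (possibly wrong) mental model of what select is doing internally, based on the getSubset/subsetImpl calls in the stack trace, is that it copies every retained channel at once. A simplified sketch of that understanding (the real code also handles normals, intensity, and range):

```matlab
% My simplified reading of the select/subsetImpl step (assumption, not the
% actual toolbox source):
loc = ptCloud.Location(validSegIds, :);   % copies XYZ for all kept points
col = ptCloud.Color(validSegIds, :);      % copies RGB for all kept points
ptVeg = pointCloud(loc, Color=col);       % builds a new cloud from the copies
```

If that reading is right, the peak usage would briefly be the original cloud plus the full subset copies, which may explain why it fails even with plenty of RAM free beforehand.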
When it reaches the line ptVeg = select(ptCloud, validSegIds);, MATLAB runs out of memory and displays the following error:
Out of memory.
Error in pointclouds.internal.pc.getSubset
[loc, c, nv, intensity, r] = pointclouds.internal.pc.getSubset( ...
[loc, c, nv, intensity, r] = this.subsetImpl(indices, outputSize);
veglabel3D = label3D(validSegIds);
numColors = max(veglabel3D);
colorMap = randi([0 255],numColors,3)/255;
labelColors = label2rgb(veglabel3D,colorMap,OutputFormat="triplets");
pcshow(ptVeg.Location,labelColors)
title("Individual Tree Segments")
uicontrol('Visible', 'off')
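For scale, here is my back-of-the-envelope arithmetic on the intermediate arrays in the colouring step. The point count is a pure guess for a ~25 GB colourised cloud, and I am assuming label3D and the label2rgb triplets are stored as doubles:

```matlab
% Rough memory estimate for the intermediates (all figures are guesses):
numPts      = 5e8;               % assumed point count for a ~25 GB cloud
bytesMask   = numPts * 1;        % validSegIds (logical, 1 byte per point)
bytesLabels = numPts * 8;        % label3D, assuming double
bytesColors = numPts * 3 * 8;    % labelColors triplets, assuming double RGB
fprintf("label colours alone: %.1f GB\n", bytesColors/2^30);
```

So even before pcshow, the per-point colour triplets alone could run to tens of gigabytes if my assumptions hold.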
A similar error also appears in the subsequent part of the script:
treeMetrics = helperExtractTreeMetrics(normalizedPoints,label3D);
Out of memory.
filteredPoints = normalizedPoints(validLabels,:);
My LiDAR dataset is around 25 gigabytes (a colourised point cloud of a 35-hectare forest stand).
I am using an image-processing workstation with 256 gigabytes of RAM, ample disk space, and an NVIDIA GeForce RTX 3090.
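One direction I have been wondering about is whether I should tile the cloud spatially and segment each tile separately, rather than selecting from the full cloud in one go. A rough sketch of what I mean (tileSize is arbitrary and the per-tile processing is a placeholder, so this is only an idea, not working code):

```matlab
% Sketch: process the point cloud in square ground tiles instead of all at
% once. tileSize and the merge step are placeholders I made up.
tileSize = 50;   % metres, assumed
xEdges = ptCloud.XLimits(1):tileSize:ptCloud.XLimits(2);
yEdges = ptCloud.YLimits(1):tileSize:ptCloud.YLimits(2);
for i = 1:numel(xEdges)-1
    for j = 1:numel(yEdges)-1
        roi = [xEdges(i) xEdges(i+1) yEdges(j) yEdges(j+1) -inf inf];
        idx  = findPointsInROI(ptCloud, roi);   % indices inside this tile
        tile = select(ptCloud, idx);            % much smaller subset
        % ...run segmentation and metrics extraction on `tile` here,
        % then accumulate the per-tile results...
    end
end
```

Would something along these lines be the right approach, or is there a more idiomatic way to handle clouds of this size?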
Could somebody please point me in the right direction? I am super grateful for the MathWorks community's help! :)
Attached below is a screenshot for additional information: