# How can I map a specific location in MATLAB using the drone images with corresponding gps data latitude and longitude?


### Answers (2)

AndresVar
on 27 Feb 2022

Edited: AndresVar
on 27 Feb 2022

You can use geoplot: Plot line in geographic coordinates - MATLAB geoplot (mathworks.com)

You can skip geolimits at first, pan and zoom to a nice view, query the limits, and then set them explicitly.

```matlab
lats = [34.07293; 34.08];
longs = [-118.44867; -118.45];
id_str = {'1';'2'}; % maybe use the file name instead, or some datetime format

data = table(id_str, lats, longs)

geoplot(data.lats, data.longs, '*m')
text(data.lats, data.longs, data.id_str, 'Color', 'm', 'FontSize', 12)
geolimits([34.0452 34.0978], [-118.4800 -118.4])
geobasemap streets
```

##### 3 Comments

AndresVar
on 1 Mar 2022

@Cristel Esmerna parse and collect the data in the loop, but plot outside the loop; otherwise you might need to use hold on.

I don't know why it would show the wrong location. geoplot worked well with a location I got off Google Maps. Verify your lat and lon for just one photo on Google Maps.
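As a sketch of the loop/plot split, assuming your drone photos are JPEGs with GPS EXIF tags (imfinfo returns those in a GPSInfo struct; dms2degrees is a Mapping Toolbox helper; the '*.JPG' pattern and the sign handling are assumptions you should check against your own files):

```matlab
% Collect coordinates in the loop, plot once afterwards.
files = dir('*.JPG');
n = numel(files);
lats = zeros(n,1);
longs = zeros(n,1);
id_str = cell(n,1);
for k = 1:n
    info = imfinfo(files(k).name);           % EXIF metadata, including GPSInfo
    gps = info.GPSInfo;                      % assumes the images carry GPS tags
    lats(k)  = dms2degrees(gps.GPSLatitude); % [deg min sec] -> decimal degrees
    longs(k) = dms2degrees(gps.GPSLongitude);
    if strcmp(gps.GPSLatitudeRef, 'S'),  lats(k)  = -lats(k);  end
    if strcmp(gps.GPSLongitudeRef, 'W'), longs(k) = -longs(k); end
    id_str{k} = files(k).name;
end
data = table(id_str, lats, longs);

geoplot(data.lats, data.longs, '*m')         % one plot call, no hold on needed
text(data.lats, data.longs, data.id_str, 'Color', 'm', 'FontSize', 12)
```

If you build the plot inside the loop instead, each geoplot call replaces the previous one unless you call hold on first.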

William Rose
on 27 Feb 2022

My interpretation of your question is that you would like to show shapes on a basemap that indicate the area covered by each drone image. In general, each rectangular drone image will map to a curvy quadrilateral on the basemap. You know the lat and long of the corners of your basemap image, and therefore you can easily compute the lat and long of any point in the basemap by simple linear interpolation from the corner points.

Approach 1. For a start, let's assume the ground covered by the basemap is all at the same altitude. Then you need to know the lat and long of the drone, which you said you have. You also need to know its altitude above the ground and the 3D orientation of the camera (which is described by 3 parameters, or by the 3x3 rotation matrix, which has only 3 independent parameters, even though it has 9 elements). You also need to know the camera projection properties, which is a non-trivial thing. If you have all that information, then you can compute the lat and long of the edge points of your image, and you can draw a shape on the basemap with those points.
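A minimal sketch of Approach 1, assuming a flat ground plane and a pinhole camera with known pose; the altitude h, focal length f, principal point (cx, cy), rotation matrix R, and image size here are all made-up example values, not something the original question supplies:

```matlab
% Project image pixels to flat ground, assuming a pinhole camera with known pose.
h  = 100;               % altitude above ground, m (assumed)
f  = 2000;              % focal length in pixels (assumed)
cx = 2000; cy = 1500;   % principal point, taken as the image center (assumed)
R  = eye(3);            % camera-to-world rotation; identity = camera pointing straight down
camPos = [0; 0; h];     % camera position in local east/north/up coordinates, m

ij = [1 1; 1 3000; 4000 3000; 4000 1];     % the four image corners as [row col]
ground = zeros(4,2);
for k = 1:4
    d = R * [ij(k,2)-cx; ij(k,1)-cy; -f];  % ray direction in the world frame
    t = -camPos(3) / d(3);                 % scale so the ray reaches z = 0 (the ground)
    p = camPos + t*d;
    ground(k,:) = p(1:2)';                 % east/north offsets from the point under the drone
end
% Convert east/north meters to lat/lon offsets around the drone's (lat0, lon0),
% then draw the quadrilateral with geoplot; roughly:
%   dlat = ground(:,2) / 111320;  dlon = ground(:,1) ./ (111320 * cosd(lat0));
```

The meters-to-degrees conversion in the final comment is the small-area approximation; over a few hundred meters it is fine for display purposes.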

If you do not know the camera orientation and camera projection properties, then you need to identify several fiducial points, also known as reference points, on the basemap, and associate them with corresponding points in the drone image. Then you use the associated points to solve for the constants in the projection equations that relate the drone image to the basemap. Depending on the equations you choose for the projection, you may end up with an over-determined set of equations to solve. This will happen if you have M constants to solve for in the projection equations and more than M/2 fiducial points, because you get 2 equations from every fiducial point. If the system is overdetermined, then you take a least-squares approach to estimate the constants.

Example: Assume the camera projection is affine, which means that straight lines on the ground get mapped to straight lines in the camera image. Each point on the basemap has a latitude ϕ and longitude λ. The points in the drone image are described by the pixel coordinates: row i, i=1:M, and column j, j=1:N. The affine projection equations can be written as

ϕ = a11*i + a12*j + a13

λ = a21*i + a22*j + a23

With 3 non-co-linear fiducial points, you can solve for the unknowns a11,...,a23 exactly. If you have more than 3 fiducial points, you use singular value decomposition (or any least-squares solver) to find the values of the six constants that minimize the errors between the actual and predicted basemap coordinates of the fiducial points. Once you have the six unknowns, you use the projection equations to determine the latitudes and longitudes of the corners of the image, and you draw straight lines between those points on the basemap.
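The affine fit above can be sketched in a few lines; the fiducial pixel coordinates and their lat/lon here are made-up example numbers, and MATLAB's backslash operator performs the least-squares solve (internally it uses a QR/SVD-style factorization, so you do not need to call svd yourself):

```matlab
% Fiducial points: pixel (row i, col j) in the drone image and known (lat, lon).
ij     = [100 200; 900 250; 500 800; 150 900];           % example pixel coordinates
latlon = [34.073 -118.449; 34.074 -118.441; ...
          34.079 -118.445; 34.080 -118.450];             % corresponding lat, lon

% Affine model: lat = a11*i + a12*j + a13,  lon = a21*i + a22*j + a23
A = [ij, ones(size(ij,1),1)];
coeffs = A \ latlon;   % 3x2 least-squares solution; col 1 -> lat, col 2 -> lon

% Map the image corners (M rows x N cols) onto the basemap as a closed quadrilateral:
M = 1000; N = 1500;
corners = [1 1; 1 N; M N; M 1];
cornerLatLon = [corners, ones(4,1)] * coeffs;
geoplot(cornerLatLon([1:4 1],1), cornerLatLon([1:4 1],2), '-m')
```

With exactly 3 fiducial points the same backslash call returns the exact solution; with more points it returns the least-squares estimate.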

##### 7 Comments

Beth Carreon
on 22 Mar 2022
