semanticseg producing marginally different values when inference is repeated
Good afternoon,
I've been using semanticseg to apply a previously trained DAG network model to segment a test set of images. I've noticed that when I repeatedly apply the same model to exactly the same image (with the same random seed, unless I am somehow setting it improperly), the output segmentation can differ very slightly. When I use [~,~,allScores] = semanticseg(image, network) to view the per-pixel, per-class softmax probabilities output by the network, these values can change each time the network is applied. The difference is very small: the values I've inspected agree to at least 5 decimal places between segmentations. Still, I'm curious where this small difference comes from, since I expected the inference procedure to be entirely deterministic. Could the use of the Parallel Computing Toolbox influence the values in this way?
Thank you for any help you can provide; I hope I haven't overlooked an obvious answer. Fortunately, this small difference in probability score shifts the final pixel classification only very rarely, on the order of 1 in 1,000,000,000 pixels, and only where the output class probabilities are practically tied to begin with. I've included one such rare example below, in which the maximum-likelihood class for the pixel shifts in response to this variation.
Considering 9 classes, note the first and the last class:
Run 1:
[0.499402880668640, 8.857092470861971e-04, 9.613345497427872e-08, 1.140553695933022e-08, 3.467669529300110e-08, 2.951414890262072e-09, 6.419115834432887e-07, 3.072153194807470e-04, 0.499403357505798]
Run 2:
[0.499403357505798, 8.857083739712834e-04, 9.613336970915043e-08, 1.140553695933022e-08, 3.467669529300110e-08, 2.951414890262072e-09, 6.419103897314926e-07, 3.072150284424424e-04, 0.499402880668640]
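For reference, here is a minimal sketch of the comparison I'm describing (testImage and net are placeholder names for my test image and trained DAGNetwork):

% Fix the random seed, then run inference twice on the same image and
% compare the per-pixel, per-class softmax scores returned by semanticseg.
rng(0);
[~, ~, scores1] = semanticseg(testImage, net);
rng(0);
[~, ~, scores2] = semanticseg(testImage, net);

% Largest absolute difference between the two score volumes
maxDiff = max(abs(scores1(:) - scores2(:)));
fprintf('Maximum per-pixel score difference: %g\n', maxDiff);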
Thanks again!
2 comments
awezmm
3 Jan 2020
Good morning Topological Sphere,
How are you setting the random seed?
best
-awezmm
TopologicalSphere
23 Jan 2020
Edited: TopologicalSphere, 23 Jan 2020
Answers (2)
Mahesh Taparia
6 Jan 2020
Hi
The slight difference in probability may be due to incorrect use of the random seed, or because the network contains a dropout layer, which drops units at random with the specified probability and can produce minor differences.
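As a quick check, you can list whether the trained network actually contains any dropout layers (a minimal sketch; net is assumed to hold your trained DAGNetwork):

% Look for dropout layers among the layers of the trained network
isDropout = arrayfun(@(l) isa(l, 'nnet.cnn.layer.DropoutLayer'), net.Layers);
if any(isDropout)
    disp('Dropout layers found:');
    disp({net.Layers(isDropout).Name}');
else
    disp('No dropout layers in this network.');
end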
3 comments
Greg Heath
12 Jan 2020
Use
clear all, close all, clc, rng(0)
on the first line of your script.
Thank you for formally accepting my answer
Greg
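If inference runs on a GPU (the question mentions the Parallel Computing Toolbox), the GPU has its own random number generator; a minimal sketch of seeding both before the call, assuming GPU execution and the placeholder variables testImage and net:

% Seed the CPU generator, and the GPU generator if a GPU is used
rng(0);       % CPU random number generator
gpurng(0);    % GPU random number generator (requires Parallel Computing Toolbox)
C = semanticseg(testImage, net);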