Different results on different computers, MATLAB R2021b 64-bit, Windows 64-bit, both Intel chips

31 views (last 30 days)
Hello everyone,
I was testing some code on two different machines, both 64-bit Windows, both 64-bit MATLAB R2021b.
I was surprised that a simple operation with the same variables and the same precision produces slightly different results.
It is not a huge operation, actually just a matrix-vector multiplication of a vector A of size 1x16 with a matrix B of size 16x3, both in single precision, resulting in a vector C of size 1x3.
I checked the bit representation of the entries of both the vector A and the matrix B, and they are exactly the same.
But when I perform the matrix-vector multiplication C = A*B; the first entry is different on the two machines.
The funny thing is that when I perform C(1) = A*B(:,1); I get the same value on both machines, and I also get the same value on both machines (but the other of the two results) when I perform C(1) = sum(A.*B(:,1)');
So summarized:
  • when I perform C = A*B, the first entries are different on the two machines ('10111101111101110110011011110110' and '10111101111101110110011011111000')
  • when I perform C(1) = A*B(:,1), the values are the same on the two machines ('10111101111101110110011011111000')
  • when I perform C(1) = sum(A.*B(:,1)'), the values are the same on the two machines ('10111101111101110110011011110110')
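For reference, the bit strings above were read out roughly like this (a minimal sketch, assuming A (1x16 single) and B (16x3 single) are already in the workspace):
C  = A*B;                               % full matrix-vector product
c1 = A*B(:,1);                          % product with the first column only
c2 = sum(A.*B(:,1)');                   % elementwise product, then sum
dec2bin(typecast(C(1), 'uint32'), 32)   % bit pattern of the first entry
dec2bin(typecast(c1, 'uint32'), 32)
dec2bin(typecast(c2, 'uint32'), 32)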
Why does this happen, and which value should I trust?
Thanks!
  4 comments
Roman Foell
Roman Foell on 1 Dec 2021
Edited: Roman Foell on 1 Dec 2021
Edit: Just as information, I tested several such matrix-vector multiplications with different vectors but the same matrix (40,000 overall), and nearly all of them gave different results.
James Tursa
James Tursa on 1 Dec 2021
Edited: James Tursa on 1 Dec 2021
Can you post some actual small examples that demonstrate this? Either post the hex versions of the numbers, or maybe a mat file. I.e., post something so that we can use the exact same numbers to start with.
Do the two machines use different BLAS libraries? Or maybe the floating point rounding mode is set differently on the two machines for some reason.
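For example, the exact bit patterns can be shared as hex strings (a minimal sketch; num2hex works elementwise on single and double arrays):
Ahex = num2hex(A)      % one 8-character hex string per single value
Bhex = num2hex(B(:))
% to rebuild the exact single values from the hex strings on the other machine:
A2 = typecast(uint32(hex2dec(Ahex)), 'single')'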

Sign in to comment.

Accepted Answer

Christine Tobler
Christine Tobler on 10 Dec 2021
Edited: Christine Tobler on 10 Dec 2021
First, about "which value should I trust?"
Both values are equally trustworthy; the differences in results come down to applying the multiplications and additions of the matrix multiplication in a different order. There is no right or wrong choice there, so it is made based on performance considerations for each hardware architecture.
So for an individual matrix, you could use the Symbolic Math Toolbox to compare which machine got closer to the exact result, but which one wins will come down to random luck for any specific input matrix.
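As a small illustration (my own toy example, not tied to the original data), summing the same single-precision values in a different order can already change the result:
x = single([1e8, 1, -1e8, 1]);
((x(1) + x(2)) + x(3)) + x(4)   % left to right: returns 1
((x(4) + x(3)) + x(2)) + x(1)   % reversed order: returns 0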
The harder question is "why does this happen?"
We make sure that MATLAB commands are reproducible, by which we mean: if you are on the same MATLAB version, the same machine, the same OS, with the same number of threads allowed for MATLAB to use, and no change to any deep-down BIOS settings, the outputs of the same command with the exact same inputs are always the same.
To see what might be going on, can you run
version -blas
on your machines? This will tell us which version of the library we call for matrix multiplication has been chosen. I suspect the two machines might be using different instruction set levels (e.g., AVX2 vs AVX-512).
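For reference, the relevant checks look roughly like this (the MKL_CBWR part is an assumption that only applies if version -blas reports Intel MKL; it is Intel's Conditional Numerical Reproducibility control and has to be set in the environment before MATLAB starts):
version -blas       % BLAS library MATLAB is linked against
version -lapack     % LAPACK library
maxNumCompThreads   % number of computational threads in use
% Assumption: with an Intel MKL BLAS, setting the environment variable
%   MKL_CBWR=AVX2
% before launching MATLAB pins the BLAS to one instruction-set code path,
% which can make results match across Intel machines (at some speed cost).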
  7 comments
Andrew Roscoe
Andrew Roscoe on 2 Nov 2023
Edited: Andrew Roscoe on 2 Nov 2023
A further finding is that to get the same Simulink results I need to bdclose() the relevant model and re-open it just prior to the simulation, OR start a fresh MATLAB session. Otherwise, "something" cached in Simulink's memory can cause a set of simulations to produce different numerical results if you re-run the set in a different order than the first time.
Deleting the slxc files and/or the contents of the slprj directory seems to have no effect. It seems to be something in the memory of the Simulink session, related to the open model, that is relevant.
This is rather annoying because, while the time penalty for constraining to AVX2 and a single thread is not too bad (10-20% simulation slowdown), the time penalty for losing the reduced JIT acceleration times when simulations are run in a sequence is very large. Often the JIT acceleration time is nearly 50% of the total time required. For a sequence of simulations that use the same (or nearly the same) model, the JIT acceleration time can drop to almost zero, or even zero, if Simulink realises that the model is similar to the one it just simulated.
BUT, to get numerical reproducibility I seem to have to bdclose() the model between every simulation, which means that the JIT acceleration takes the full time, every time, even if I leave the slprj directory intact.
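The workaround I'm using looks roughly like this (a minimal sketch; 'myModel' is a placeholder name):
bdclose('myModel');       % discard the in-memory model state
load_system('myModel');   % re-open, forcing a fresh compile/JIT pass
out = sim('myModel');     % run the simulation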
Walter Roberson
Walter Roberson on 2 Nov 2023
To check:
If you set the random number seed to a constant before each run, does the same problem happen? (I assume here that even if you do not knowingly use random numbers, something in your model might be using random numbers.)
Something else that can cause subtle differences is if somehow the rounding mode got set. The rounding mode at the MATLAB level is not documented; it is set via system_dependent() or feature(); see https://undocumentedmatlab.com/articles/undocumented-feature-function
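For example (a minimal sketch; the rounding-mode calls are undocumented and taken from the article above, so treat the exact syntax as an assumption):
rng(0, 'twister');           % pin the random seed before each run
% Assumption (undocumented): query / reset the floating-point rounding mode
feature('setround')          % returns the current rounding mode
feature('setround', 0.5)     % round to nearest (the default)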

Sign in to comment.

More Answers (1)

Roman Foell
Roman Foell on 1 Dec 2021
Edited: Roman Foell on 2 Dec 2021
I attached the example variables A, B in my first post.
@James Tursa: How can I check the BLAS setting? How can I check the floating-point rounding mode?
Edit: Following https://de.mathworks.com/matlabcentral/answers/223952-configuration-of-lapack-and-blas-in-matlab, the BLAS library depends on the MATLAB version; I used MATLAB R2021b on both machines.
  6 comments
Andres
Andres on 7 Dec 2021
I can also confirm different results on different computers.
MATLAB Online gave the same results as in my previous comment, but
R2020a and R2021b on an Intel(R) Core(TM) i5-7300U give:
C1 = dec2bin(typecast( (A*B) * [1;0;0], 'uint32'))   % first entry of the full product
C2 = dec2bin(typecast( A*B(:,1), 'uint32'))          % product with the first column only
C3 = dec2bin(typecast( sum(A.*B(:,1)'), 'uint32'))   % elementwise product, then sum
N1 = (A*B) * [1;0;0] - A*B(:,1)                      % difference of the first two
C1 =
'10111101111101110110011011111000'
C2 =
'10111101111101110110011011111000'
C3 =
'10111101111101110110011011110110'
N1 =
single
0
Roman Foell
Roman Foell on 7 Dec 2021
@Andres: Thanks, so it's actually the same as for me. Could you figure out where this difference comes from?

Sign in to comment.
