## Merge 3D matrices into one cell array without using a loop

### mashtine — asked 24 Jun 2015 (latest activity: edited by Stephen Cobeldick, 25 Jun 2015)
I have three 3-D matrices (m x n x t) that I would like to merge into a single m x n cell array, with each cell holding the corresponding data from the original matrices. Thus {1,1} of the new array would contain a t x 3 matrix.

For instance, A, B and C are 10x10x300 matrices. How do I make D a 10x10 cell array where D{1,1} is a 300x3 matrix, without using a loop, just simple vector indexing?

Hope this makes sense!

### Stephen Cobeldick — answered 24 Jun 2015 (edited 24 Jun 2015)
Try this:

```matlab
A = rand(10,10,300);
B = rand(10,10,300);
C = rand(10,10,300);
D = permute(cat(4,A,B,C),[3,4,1,2]); % join along dim 4, re-orient to 300x3x10x10
V = ones(1,10);
D = squeeze(mat2cell(D,300,3,V,V)); % split into a 10x10 cell array
```
And the output:

```matlab
>> size(D)
ans =
    10    10
>> size(D{1,1})
ans =
   300     3
```

### Stephen Cobeldick — commented 25 Jun 2015
"Is there a better way?": yes, when you plan your data organisation so that this is not required. Whatever method you choose will require creating new arrays, concatenation, permuting, reshaping or other similar operations... the fastest way to achieve this is to avoid it altogether. Plan your data, and programming becomes much easier.
In this case I would recommend the following:
• generate/preallocate one numeric array and store all of the data in it (not in three arrays).
• access the required parts using indexing.
This will be many times faster than any answer to your original question.
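A minimal sketch of this recommendation, with the sizes taken from the question; `rand` stands in for however the data is actually generated or read, and the variable names are illustrative:

```matlab
t = 300; m = 10; n = 10;
data = zeros(t, 3, m, n);          % preallocate once: t-by-3-by-m-by-n
% Fill the one array directly instead of building A, B and C:
data(:,1,:,:) = rand(t, 1, m, n);  % what would have been A
data(:,2,:,:) = rand(t, 1, m, n);  % what would have been B
data(:,3,:,:) = rand(t, 1, m, n);  % what would have been C
sub = data(:,:,1,1);               % the 300x3 block for "cell {1,1}", by indexing alone
```

No cell array, no `mat2cell`, no `permute` is needed once the data lives in one array.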
### mashtine — commented 25 Jun 2015
Thanks Stephen,

I wish it were so simple. However, I have very large files (a few GB each) of netCDF data that I need to run calculations on that aren't so straightforward. Hardware is the biggest issue, hence my splitting of arrays. My computer simply cannot handle one large array: with all the data stored in it, it would be over 25 GB. I cannot even load that into my workspace. Unless, again, I am missing something. Steep learning curves!
### Stephen Cobeldick — commented 25 Jun 2015
There are basically three ways of dealing with Big Data:
1. Change the algorithm so that it is not necessary to hold all of the data in memory.
2. Use tools designed to operate on Big Data.
3. Buy more memory.
Do an [internet search engine] search for "MATLAB Big Data" and you will find lots of discussions on these topics. Well, mostly on the first two, but the third gets mentioned too!
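For netCDF files specifically (mentioned in the comment above), one such tool is MATLAB's `ncread`, whose optional start/count arguments read only a slab of a variable rather than the whole file; the filename and variable name below are hypothetical:

```matlab
% Read only a 10x10x300 slab of a large netCDF variable, starting at
% index [1 1 1], instead of loading the entire dataset into memory:
slab = ncread('bigdata.nc', 'temperature', [1 1 1], [10 10 300]);
```

Calculations can then be run slab by slab, keeping the working set far below the 25 GB total.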

on 25 Jun 2015
Edited by Matt J

### Matt J (view profile)

on 25 Jun 2015

There is no way to do it without for-loops. Note that mat2cell and friends are all M-files that use loops internally.

The data organization you are pursuing is ill-advised. Instead of a cell array, you should just cat() them into a 4-D numeric array:

```matlab
D = cat(4,A,B,C);
```

Now to access a tx3 sub-array, you can do things like

```matlab
squeeze(D(i,j,:,:))
```

It would have been much better and cleaner if you had instead made the original arrays tx1xmxn. That way, you could concatenate as

```matlab
D = cat(2,A,B,C);
```

and your sub-arrays would be in more efficient memory-contiguous blocks, and also more simply indexed as D(:,:,i,j). You can use permute() to achieve this, of course, but permute() is an expensive operation.