matlab - Best way to flatten a 2D matrix to 1D when sliced from a 3D matrix variable


I have a 3D matrix in MATLAB that stores a sequence of 2D arrays. I need to find the maximal value of each 2D slice along with its row and column indices. This is pretty straightforward when a single variable holds the 2D array:

a = rand(10,10); [m,i] = max(a(:)); [i,j] = ind2sub(size(a), i)

The trouble is that I cannot use the same syntax with a 3D matrix:

a = rand(10,10,3); [m,i] = max( a(:,:,1)(:) ); [i,j] = ind2sub(size( a(:,:,1) ), i)
error: ()-indexing must appear last in index expression.

I could create a temporary variable to store the 2D slice, but I thought I'd see if there's a better means of doing this, maybe by making a call to reshape? Is there a way to use the simple linearizing/flattening operator (:) in this context? The temporary-variable workaround I'm trying to avoid is sketched below.
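For reference, a minimal sketch of that temporary-variable workaround (tmp is just an illustrative name):

tmp = a(:,:,1);                   % temporary copy of the 2D slice
[m, idx] = max(tmp(:));           % maximum over the flattened slice
[i, j] = ind2sub(size(tmp), idx); % row/column indices of the maximum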

Here's what I'd do: reshape the array so that each 2D slice becomes a column, then take the column-wise max:

[b,i] = max(reshape(a,[],size(a,3))); [ii,jj] = ind2sub(size(a), i);

The limitation is that this won't handle cases where there is more than one maximum per 2D slice.
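If you do need every position that ties for the maximum in each slice, a minimal sketch (my own addition, not part of the one-liner above) would loop over the slices and use find:

for k = 1:size(a,3)
    slice_k = a(:,:,k);                          % current 2D slice
    [ii, jj] = find(slice_k == max(slice_k(:))); % all rows/columns tied for the max
    % ii and jj list every position in slice k attaining the maximum
end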

