Set a matrix as index of map

Alessandro
Alessandro on 19 Aug 2019
Commented: Guillaume on 20 Aug 2019
Hi, I am trying to cache the results of a lot of convolutions to speed up my algorithm. The arguments to the convolution are a binary matrix, which differs in content and size from call to call, and a filter matrix. Specifically, this line of code:
numPixelsSubtracted = conv2(newPixelsCovered(y1:y2, x1:x2), mat.filters{r}, 'same');
So far I have tried building a key by converting newPixelsCovered into a string with strcat (much slower than conv2 itself) and by adapting the Cantor pairing function, but the latter produced colliding keys. Encoding the matrix with powers overflows because my matrices are big. So, is there a way to use the matrix itself directly as an index and then look up the convolution by the matrix? It doesn't have to be a containers.Map specifically; any structure will do. I am not very familiar with all of MATLAB's data structures, so maybe it's a stupid question.
Also, I can't store pointers to the variables or anything similar, because newPixelsCovered is recalculated each time, so the same matrix won't correspond to the same pointer.
Thank you.
  3 Comments
Alessandro
Alessandro on 19 Aug 2019
Most of the questions are answered in the first sentence: "I am trying to make a map of a lot of convolutions to speed up my algorithm". But I'll try to put it another way: I want to store convolutions so I can reuse them when they have already been calculated for the same matrix pattern (newPixelsCovered). To store them I came up with a containers.Map object. The next step was to build a key/index that is cheap to compute when I have the pattern. However, MATLAB doesn't allow a matrix to be used as a key.
For the strcat part, if I do:
sprintf("%s", strcat(',', char(newPixelsCovered(y1:y2, x1:x2))))
then I can use the result as a key.
The overlapping happens when I use the Cantor pairing function. I computed it over a matrix the size of newPixelsCovered, but some of the resulting keys collide, so I wrongly pick up the convolution belonging to a different matrix.
If it's still not clear, forget all of that; the question is actually simple: I have matrix A, and from A I produce matrix B. I want to store B in something (maybe a containers.Map, but I am open to other solutions) that I can access quickly through A.
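Since newPixelsCovered is binary, one workable sketch is to encode its size and contents as a char key for a containers.Map (the helper name makeKey is hypothetical, not from the original code):

```matlab
% Cache of convolution results keyed by a string encoding of the binary matrix
convCache = containers.Map('KeyType', 'char', 'ValueType', 'any');

% Encode size plus contents so e.g. [1 0;1 0] and [1;0;1;0] get different keys.
% For a 0/1 matrix, A(:).' + '0' maps each element to the char '0' or '1'.
makeKey = @(A) [sprintf('%dx%d:', size(A,1), size(A,2)), char(A(:).' + '0')];

A = logical([1 0 1; 0 1 0]);
key = makeKey(A);
if isKey(convCache, key)
    B = convCache(key);                      % reuse the cached convolution
else
    B = conv2(double(A), ones(3), 'same');   % compute once...
    convCache(key) = B;                      % ...and store for next time
end
```

Unlike the Cantor approach, this key is injective for binary matrices (two different patterns can never produce the same string), so there are no collisions.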
Adam
Adam on 19 Aug 2019
Take a look at
doc memoize
It sounds like it would serve you far better than a home-made version.


Accepted Answer

Guillaume
Guillaume on 19 Aug 2019
Edited: Guillaume on 19 Aug 2019
Certainly, I would never have guessed from your initial description that you wanted to cache the output of some convolutions.
As Adam said, it's easily done in R2019a: use memoize (available since R2017a), which is designed exactly for this:
myconv2 = memoize(@conv2);
%you may want to increase the cache size (default: 10 combinations of inputs)
myconv2.CacheSize = 30;
myconv2(magic(10), ones(3), 'same') %execute and cache
myconv2(magic(10), eye(3), 'same') %execute and cache (since different inputs)
myconv2(magic(10), ones(3), 'same') %get result directly from the cache since inputs have already been used
You can see how well it performs and what's been cached with
myconv2.stats
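Applied to the line from the question, the memoized function is a drop-in replacement (a sketch reusing the question's variable names; memoize keys the cache on the input values, so repeated patterns hit the cache automatically):

```matlab
% Create the memoized wrapper once, outside any loop
myconv2 = memoize(@conv2);
myconv2.CacheSize = 30;

% Inside the loop, call it exactly like conv2; identical inputs are
% fetched from the cache instead of being recomputed
numPixelsSubtracted = myconv2(newPixelsCovered(y1:y2, x1:x2), mat.filters{r}, 'same');
```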
  4 Comments
Adam
Adam on 20 Aug 2019
It is an unfortunate aspect of memoize that its cache is managed only by the number of things cached, rather than by their size. Despite my earlier comment that memoize would serve you 'far better', I did actually write my own caching class for an example I had to do, because I needed to manage my cache based on memory rather than on the number of cached elements. However, I'd still recommend memoize as a starting point. If your convolutions have fairly predictable sizes, you can manage the cache size decently with a simple calculation from the average size you expect a cached element to be.
In my case, looking back at the code, I ran a hash function on a class containing my input parameters and used a containers.Map as my own cache. After trying more complicated hashing functions, just concatenating the parameters into a string worked fine.
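That string-concatenation approach might look something like this (a sketch; the parameter names are hypothetical, not from the actual code described above):

```matlab
cache = containers.Map('KeyType', 'char', 'ValueType', 'any');

% Hypothetical input parameters that determine the result
kernelWidth = 3;
imgSize = [480 640];

% Concatenate the parameters into a char key
key = sprintf('k%d_s%dx%d', kernelWidth, imgSize(1), imgSize(2));

if ~isKey(cache, key)
    cache(key) = rand(imgSize);   % stand-in for the expensive computation
end
result = cache(key);
```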
Guillaume
Guillaume on 20 Aug 2019
I think that memoize has been designed with small caches in mind. There is indeed no option to manage the memory used by the cache, and cache lookup is certainly not optimised for large caches (lookup is O(n)). If MathWorks had large caches in mind, I would have expected them to go with a hash map. Although, with MATLAB not having built-in hashing functions, that's probably not trivial to implement.
Even the circular buffer used by the default caching mode is implemented inefficiently: each rotation is a complete rewrite of the whole buffer.
It's also disappointing that the caching policies are undocumented and hidden.
I suppose that if the performance of the memoizer is not good enough, you could copy the two relevant classes (matlab.lang.internal.Memoizer and matlab.lang.MemoizedFunction) and make your own tailored version (or indeed start from scratch). Be careful: MemoizedFunction has some unusual constructs thanks to some of its hidden and undocumented superclasses. It does not overload subsref, relying instead on matlab.mixin.internal.indexing.Paren and matlab.mixin.internal.indexing.DotNotOverloaded (why are these undocumented and hidden? They look very useful!).

