
Commit b87f566

Author: mcimpoi
Code for Deep Filterbanks paper
0 parents, commit b87f566

35 files changed, +2956 -0 lines changed

LatexTable.m

+68

% LatexTable: small helper class for assembling LaTeX tabular output.
% Typical use: call begin(), add cells with pf() and close rows with endl(),
% optionally inserting vline()/hline() separators, then call end() to obtain
% the assembled \begin{tabular}...\end{tabular} string.
classdef LatexTable < handle
  properties
    rowSuffixes   % per-row trailing text (row terminators)
    rowPrefixes   % per-row leading text (e.g. \hline)
    colPrefixes   % per-column leading text (e.g. |)
    entries       % cell array of formatted cell contents
    r             % current row cursor
    c             % current column cursor
  end

  methods
    function obj = begin(obj)
      obj.rowSuffixes = {} ;
      obj.rowPrefixes = {''} ;
      obj.colPrefixes = {''} ;
      obj.entries = {} ;
      obj.r = 1 ;
      obj.c = 1 ;
    end

    % Append one sprintf-formatted cell to the current row.
    function obj = pf(obj, str, varargin)
      obj.entries{obj.r,obj.c} = sprintf(str, varargin{:}) ;
      obj.c = obj.c + 1 ;
      if obj.r == 1, obj.colPrefixes{obj.c} = '' ; end
    end

    % Terminate the current row and move the cursor to the next one.
    function obj = endl(obj)
      obj.rowSuffixes{obj.r} = sprintf('\\\\\n') ;
      obj.c = 1 ;
      obj.r = obj.r + 1 ;
      obj.rowPrefixes{obj.r} = '' ;
    end

    % Add a vertical separator before the current column.
    function obj = vline(obj)
      obj.colPrefixes{obj.c} = horzcat(obj.colPrefixes{obj.c}, '|') ;
    end

    % Add a horizontal rule before the current row.
    function obj = hline(obj)
      obj.rowPrefixes{obj.r} = horzcat(obj.rowPrefixes{obj.r}, sprintf('\\hline\n')) ;
    end

    function move(obj, r, c)
      obj.r = r ;
      obj.c = c ;
    end

    % Assemble and return the complete tabular environment as a string.
    function str = end(obj)
      str = {} ;
      nc = size(obj.entries,2) ;
      nr = size(obj.entries,1) ;
      sizes = cellfun(@(x) numel(x), obj.entries) ;
      widths = max(sizes,[],1) ;

      str{end+1} = '\begin{tabular}{' ;
      for c=1:nc
        str{end+1} = [obj.colPrefixes{c} 'c'] ;
      end
      str{end+1} = sprintf('%s}\n', obj.colPrefixes{nc+1}) ;

      for r = 1:nr
        str{end+1} = obj.rowPrefixes{r} ;
        for c = 1:nc
          format = sprintf('%%%ds', widths(c)) ;
          if c > 1, format = [' &' format] ; end
          str{end+1} = sprintf(format, obj.entries{r,c}) ;
        end
        str{end+1} = obj.rowSuffixes{r} ;
      end
      str{end+1} = obj.rowPrefixes{nr+1} ;
      str{end+1} = sprintf('\\end{tabular}\n') ;
      str = horzcat(str{:}) ;
    end
  end
end

README.md

+80

# Deep filter banks for texture recognition, description, and segmentation

The code provided runs the evaluation of RCNN and FV-CNN on various texture and material datasets (DTD, FMD, KTH-TIPS2b, ALOT), as well as on other domains: objects (VOC07), scenes (MIT Indoor), and fine-grained categories (CUB).
The results of these experiments are reported in Tables 1 and 2 of the paper **Deep Filter Banks for Texture Recognition and Segmentation, M. Cimpoi et al., CVPR 2015** and in Tables 3, 4, 5 and 6 of the paper **Deep filter banks for texture recognition, description, and segmentation, M. Cimpoi et al., http://arxiv.org/abs/1507.02620**.

## Getting started

Once you have downloaded the code, make sure you have installed the dependencies (see below).
Download the datasets you want to evaluate on, and link to them or copy them under the `data` folder in your local copy of the repository. Download the models (VGG-M, VGG-VD and AlexNet) into `data/models`. It is slightly faster to download them manually from here: http://www.vlfeat.org/matconvnet/pretrained/
Once done, simply run the `run_experiments.m` file.

In `texture_experiments.m` you can remove (or add) dataset names in the `datasetList` cell; make sure you adjust the number of splits accordingly. Each dataset is specified as a `{'dataset_name', <num_splits>}` cell, as in the sketch below.
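
For illustration, a dataset list along these lines might look as follows; the dataset identifiers and split counts below are placeholders, not the exact values used in `texture_experiments.m`:

```matlab
% Hypothetical sketch: each entry pairs a dataset name with its number of splits.
datasetList = { ...
  {'dtd', 10}, ...         % Describable Textures Dataset
  {'fmd', 10}, ...         % Flickr Material Database
  {'kthtips2b', 4}} ;      % KTH-TIPS2b
```
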
### Dependencies

The code relies on [vlfeat](http://www.vlfeat.org/) and [matconvnet](http://www.vlfeat.org/matconvnet/), which should be downloaded and built before running the experiments.
Run `git submodule update -i` in the repository download folder.

To build vlfeat, go to `<DEEP-FBANKS_DIR>/vlfeat` and run `make`; ensure the MATLAB executable and `mex` are in the path.
To build matconvnet: in MATLAB, go to `<DEEP-FBANKS_DIR>/matconvnet/matlab` and run `vl_compilenn`; ensure you have CUDA installed and `nvcc` in the path.
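
As a rough sketch of these steps from within MATLAB, assuming the submodule layout above and a matconvnet version that accepts the `enableGpu` option (omit it for a CPU-only build):

```matlab
% Run from <DEEP-FBANKS_DIR>, after building vlfeat with make.
run('vlfeat/toolbox/vl_setup') ;     % add vlfeat to the MATLAB path
cd('matconvnet/matlab') ;
vl_compilenn('enableGpu', true) ;    % requires CUDA, with nvcc on the path
```
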
For LLC features (Table 3 in the arXiv paper), please download the code from [http://www.robots.ox.ac.uk/~vgg/software/enceval_toolkit](http://www.robots.ox.ac.uk/~vgg/software/enceval_toolkit) and copy the following files to the code folder (no subfolders!):

* `enceval/enceval-toolkit/+featpipem/+lib/LLCEncode.m`
* `enceval/enceval-toolkit/+featpipem/+lib/LLCEncodeHelper.cpp`
* `enceval/enceval-toolkit/+featpipem/+lib/annkmeans.m`

Then create the corresponding `dcnnllc` encoder type (see the examples provided in `run_experiments.m` for BOVW, VLAD or FV).
### Paths and datasets

The `<dataset_name>_get_database.m` files generate the imdb file for each dataset. Make sure the datasets are copied or linked manually in the `data` folder.

The datasets are stored in individual folders under `data`, in the current code folder, and experiment results are stored in the `data/exp01` folder, in the same location as the code. Alternatively, you could make `data` and `experiments` symbolic links pointing to convenient locations.

Please be aware that the descriptors are stored on disk (in the cache folder, under `data/exp01/<experiment-dir>`) and may require large amounts of free space (especially the FV-CNN features).

### Dataset and evaluation

The Describable Textures Dataset (DTD) is publicly available for download at:
[http://www.robots.ox.ac.uk/~vgg/data/dtd](http://www.robots.ox.ac.uk/~vgg/data/dtd). You can also download the precomputed DeCAF features for DTD, the paper and the evaluation results there.

Our additional annotations for the OpenSurfaces dataset are publicly available for download at:
http://www.robots.ox.ac.uk/~vgg/data/wildtex/

Code is available at:
TODO: GITHUB LINK

Code for the CVPR14 paper (and Table 2 in the arXiv paper):
http://www.robots.ox.ac.uk/~vgg/data/dtd/download/desctex.tar.gz

### Citation

If you use the code and data, please cite the following in your work:

FV-CNN and OpenSurfaces additional annotations:

@Article{Cimpoi15a,
  Author  = "Cimpoi, M. and Maji, S. and Kokkinos, I. and Vedaldi, A.",
  Title   = "Deep Filter Banks for Texture Recognition, Description, and Segmentation",
  Journal = "arXiv preprint arXiv:1507.02620",
  Year    = "2015",
}

@InProceedings{Cimpoi15,
  Author    = "Cimpoi, M. and Maji, S. and Vedaldi, A.",
  Title     = "Deep Filter Banks for Texture Recognition and Segmentation",
  Booktitle = "IEEE Conference on Computer Vision and Pattern Recognition",
  Year      = "2015",
}

DTD Dataset and IFV + DeCAF:

@inproceedings{cimpoi14describing,
  Author    = "M. Cimpoi and S. Maji and I. Kokkinos and S. Mohamed and A. Vedaldi",
  Title     = "Describing Textures in the Wild",
  Booktitle = "Proceedings of the {IEEE} Conf. on Computer Vision and Pattern Recognition ({CVPR})",
  Year      = "2014",
}

alot_get_database.m

+42

% Build the imdb structure for the ALOT dataset (20/80 train/test split per class).
function imdb = alot_get_database(alotDir, varargin)
opts.seed = 0 ;
opts.version = 'grey2';
opts = vl_argparse(opts, varargin) ;

rng(opts.seed, 'twister');

imdb.imageDir = fullfile(alotDir, opts.version) ;
imdb.maskDir = fullfile(alotDir, 'mask'); % DO NOT USE

cats = dir(imdb.imageDir);
cats = cats([cats.isdir] & ~ismember({cats.name}, {'.','..'})) ;
imdb.classes.name = {cats.name} ;
imdb.images.id = [] ;

for c=1:numel(cats)
  ims = dir(fullfile(imdb.imageDir, imdb.classes.name{c}, '*.png'));
  %imdb.images.name{c} = fullfile(imdb.classes.name{c}, {ims.name}) ;
  imdb.images.name{c} = cellfun(@(S) fullfile(imdb.classes.name{c}, S), ...
    {ims.name}, 'Uniform', 0);
  imdb.images.label{c} = c * ones(1,numel(ims)) ;
  if numel(ims) ~= 100, error('ops') ; end
  % http://cmp.felk.cvut.cz/~sulcmila/papers/Sulc-TR-2014-12.pdf
  % uses 20 for training, 80 for testing, per class
  sets = [1 * ones(1,20), 3 * ones(1,80)];
  imdb.images.set{c} = sets(randperm(100)) ;
end
imdb.images.name = horzcat(imdb.images.name{:}) ;
imdb.images.label = horzcat(imdb.images.label{:}) ;
imdb.images.set = horzcat(imdb.images.set{:}) ;
imdb.images.id = 1:numel(imdb.images.name) ;

imdb.segments = imdb.images ;
imdb.segments.imageId = imdb.images.id ;
imdb.segments.mask = strrep(imdb.images.name, 'image', 'mask') ;

% make this compatible with the OS imdb
imdb.meta.classes = imdb.classes.name ;
imdb.meta.inUse = true(1,numel(imdb.meta.classes)) ;
imdb.segments.difficult = false(1, numel(imdb.segments.id)) ;
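
A minimal usage sketch; the dataset location below is an assumption, not something fixed by this commit:

imdb = alot_get_database('data/alot', 'version', 'grey2', 'seed', 0) ;
% imdb.images.name/.label/.set now hold per-image paths, class labels, and the
% train (1) / test (3) split used by the experiments.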

compute_confusion.m

+13

% Compute the confusion matrix C and accuracy ACC from ground-truth labels GTS
% and predictions PREDS (integer labels in 1..numClasses). AREAS optionally
% gives per-sample weights. By default the confusion matrix is normalized per
% class and ACC is the mean of its diagonal.
function [c, acc] = compute_confusion(numClasses, gts, preds, areas, doNotNormalizePerClass)
if ~exist('doNotNormalizePerClass', 'var')
  doNotNormalizePerClass = false ;
end
if nargin <= 3, areas = ones(size(gts)) ; end
c = accumarray([gts(:), preds(:)], areas(:), numClasses*[1,1]) ;
if ~doNotNormalizePerClass
  % normalize each row (class) to sum to one; acc is mean per-class accuracy
  c = bsxfun(@times, 1./sum(c,2), c) ;
  acc = mean(diag(c)) ;
else
  % normalize by the total weight; acc is overall (weighted) accuracy
  c = c / sum(c(:)) ;
  acc = sum(diag(c)) ;
end
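
A minimal usage sketch with made-up labels, assuming class indices in 1..numClasses:

gts   = [1 1 2 2 3] ;                       % ground-truth class labels
preds = [1 2 2 2 3] ;                       % predicted class labels
[conf, acc] = compute_confusion(3, gts, preds) ;
% conf is the 3x3 row-normalized confusion matrix, acc the mean per-class accuracy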

cub_get_database.m

+47

% Build the imdb structure for the CUB birds dataset.
function imdb = cub_get_database(cubDir, useCropped)

if (nargin < 2)
  useCropped = false;
end
% Automatically change directories
if useCropped
  imdb.imageDir = fullfile(cubDir, 'images_cropped') ;
else
  imdb.imageDir = fullfile(cubDir, 'images');
end

imdb.maskDir = fullfile(cubDir, 'masks'); % doesn't exist
imdb.sets = {'train', 'val', 'test'};

% Class names
[~, classNames] = textread(fullfile(cubDir, 'classes.txt'), '%d %s');
imdb.classes.name = horzcat(classNames(:));

% Image names
[~, imageNames] = textread(fullfile(cubDir, 'images.txt'), '%d %s');
imdb.images.name = imageNames;
imdb.images.id = (1:numel(imdb.images.name));

% Class labels
[~, classLabel] = textread(fullfile(cubDir, 'image_class_labels.txt'), '%d %d');
imdb.images.label = reshape(classLabel, 1, numel(classLabel));

% Bounding boxes
[~,x, y, w, h] = textread(fullfile(cubDir, 'bounding_boxes.txt'), '%d %f %f %f %f');
imdb.images.bounds = round([x y x+w-1 y+h-1]');

% Image sets
[~, imageSet] = textread(fullfile(cubDir, 'train_test_split.txt'), '%d %d');
imdb.images.set = zeros(1,length(imdb.images.id));
imdb.images.set(imageSet == 1) = 1;
imdb.images.set(imageSet == 0) = 3;

% Write out the segments
imdb.segments = imdb.images;
imdb.segments.imageId = imdb.images.id;
imdb.segments.mask = strrep(imdb.images.name, 'image', 'mask');

% make this compatible with the OS imdb
imdb.meta.classes = imdb.classes.name ;
imdb.meta.inUse = true(1,numel(imdb.meta.classes)) ;
imdb.segments.difficult = false(1, numel(imdb.segments.id)) ;

curet_get_database.m

+39

% Build the imdb structure for the CUReT dataset (46/46 train/test split per class).
function imdb = curet_get_database(curetDir, varargin)
opts.seed = 0 ;
opts = vl_argparse(opts, varargin) ;

rng(opts.seed, 'twister') ;

imdb.imageDir = fullfile(curetDir, 'curetcol') ;
imdb.maskDir = fullfile(curetDir, 'mask'); % DO NOT USE

cats = dir(imdb.imageDir);
cats = cats([cats.isdir] & ~ismember({cats.name}, {'.','..'})) ;
imdb.classes.name = {cats.name} ;
imdb.images.id = [] ;

for c=1:numel(cats)
  ims = dir(fullfile(imdb.imageDir, imdb.classes.name{c}, '*.png'));
  %imdb.images.name{c} = fullfile(imdb.classes.name{c}, {ims.name}) ;
  imdb.images.name{c} = cellfun(@(S) fullfile(imdb.classes.name{c}, S), ...
    {ims.name}, 'Uniform', 0);
  imdb.images.label{c} = c * ones(1,numel(ims)) ;
  if numel(ims) ~= 92, error('ops') ; end
  sets = [1 * ones(1,46), 3 * ones(1,46)];
  imdb.images.set{c} = sets(randperm(92)) ;
end
imdb.images.name = horzcat(imdb.images.name{:}) ;
imdb.images.label = horzcat(imdb.images.label{:}) ;
imdb.images.set = horzcat(imdb.images.set{:}) ;
imdb.images.id = 1:numel(imdb.images.name) ;

imdb.segments = imdb.images ;
imdb.segments.imageId = imdb.images.id ;
imdb.segments.mask = strrep(imdb.images.name, 'image', 'mask') ;

% make this compatible with the OS imdb
imdb.meta.classes = imdb.classes.name ;
imdb.meta.inUse = true(1,numel(imdb.meta.classes)) ;
imdb.segments.difficult = false(1, numel(imdb.segments.id)) ;
