Incremental multi-domain learning with network latent tensor factorization
The prominence of deep learning, large amounts of annotated data, and increasingly powerful hardware have made it possible to reach remarkable performance on supervised classification tasks, in many cases saturating the training sets. However, adapting the learned classifiers to new domains remains a hard problem for at least three reasons: (1) the domains and the tasks may be drastically different; (2) there may be very limited amounts of annotated data in the new domain; and (3) fully training a new model for each new task is prohibitive in terms of memory, due to the sheer number of parameters of deep networks. Instead, new tasks should be learned incrementally, building on prior knowledge from already learned tasks, and without catastrophic forgetting, i.e. without hurting performance on prior tasks. To our knowledge, this paper presents the first method for multi-domain/task learning without catastrophic forgetting using a fully tensorized architecture. Our main contribution is a method for multi-domain learning which models groups of identically structured blocks within a CNN as a high-order tensor. We show that this joint modelling naturally leverages correlations across different layers and results in more compact representations for each new task/domain than previous methods, which have focused on adapting each layer separately. We apply the proposed method to the 10 datasets of the Visual Decathlon Challenge and show that it offers on average about a 7.5x reduction in the number of parameters, together with superior performance in terms of both classification accuracy and Decathlon score. In particular, our method outperforms all prior work on the Visual Decathlon Challenge.
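To make the core idea concrete, here is a minimal sketch of jointly parametrizing a stack of identically structured conv blocks as one high-order tensor via a Tucker decomposition (the decomposition the first author names in the tweets below): a task-agnostic core is shared across tasks, and only small per-mode factor matrices are specialised per task. All shapes, ranks, and names below are illustrative assumptions, not the paper's actual configuration.

```python
import torch
import torch.nn as nn


class TuckerBlockBank(nn.Module):
    """Parametrize the weights of `num_blocks` identically shaped conv
    blocks jointly as a single 5th-order tensor in Tucker form:
    a shared core plus one factor matrix per tensor mode.
    Shapes and ranks are illustrative guesses."""

    def __init__(self, num_blocks, out_ch, in_ch, k, ranks):
        super().__init__()
        dims = (num_blocks, out_ch, in_ch, k, k)
        assert len(ranks) == len(dims)
        # Shared core: learned on the source domain, then kept fixed.
        self.core = nn.Parameter(0.02 * torch.randn(*ranks))
        # One factor matrix per mode; a fresh (cheap) set of these
        # would be learned for each new task/domain.
        self.factors = nn.ParameterList(
            nn.Parameter(0.02 * torch.randn(d, r))
            for d, r in zip(dims, ranks)
        )

    def full_weights(self):
        # Reconstruct W = core x_1 U1 x_2 U2 ... x_5 U5 (mode-n products).
        w = self.core
        for mode, u in enumerate(self.factors):
            w = torch.tensordot(u, w, dims=([1], [mode]))  # contract rank axis
            w = w.movedim(0, mode)  # put the new full-size axis back in place
        return w  # shape: (num_blocks, out_ch, in_ch, k, k)


bank = TuckerBlockBank(num_blocks=4, out_ch=64, in_ch=64, k=3,
                       ranks=(3, 24, 24, 3, 3))
kernels = bank.full_weights()  # full conv kernels for all 4 blocks
print(kernels.shape)           # torch.Size([4, 64, 64, 3, 3])
```

Under this sketch, adapting to a new task would mean fine-tuning only the factor matrices (plus a classifier head) while freezing the shared core, so the per-task parameter overhead is the small factors rather than a full copy of the network.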
Authors

Adrian Bulat
Jean Kossaifi
Georgios Tzimiropoulos
Maja Pantic

Tweets
JeanKossaifi: @zacharylipton @furlanel I have a paper on this: https://t.co/nyLYwnA87A We learn a core tensor parametrizing the net on the source domain and specialise it for each new task. The linear case you mention sounds like the work by Rebuffi, which we compare to in the paper.
JeanKossaifi: Incremental multi-domain learning with network latent tensor factorization. New work w/ @AdrianBulat, Yorgos & @MajaPantic70: a domain agnostic Tucker core parametrizes the DCNN & is specialised w/ domain specific sets of factors. SOTA on Visual Decathlon https://t.co/nyLYwnA87A