Types of Transfer Learning
Transfer learning can be categorized based on the relationship between source and target domains and tasks (a short code sketch after the list summarizes these relationships):
1. Inductive Transfer Learning:
- The target task is different from the source task;
- Example: using an image classifier trained on animals to classify medical images.
2. Transductive Transfer Learning:
- The source and target tasks are the same, but the domains differ;
- Example: sentiment analysis on English reviews (source) and French reviews (target).
3. Unsupervised Transfer Learning:
- Both tasks are unsupervised (e.g., clustering), but domains differ.
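As a quick illustration, the small helper below maps the domain/task relationships from the list above to the corresponding type. The function name and flags are invented for this sketch; they are not part of any library.

```python
def transfer_type(same_task: bool, same_domain: bool, supervised: bool = True) -> str:
    """Name the transfer-learning setting from the task/domain relationships (illustrative helper)."""
    if not supervised:
        return "unsupervised transfer learning"   # both tasks unsupervised (e.g., clustering), domains differ
    if same_task and not same_domain:
        return "transductive transfer learning"   # same task, different domains
    if not same_task:
        return "inductive transfer learning"      # target task differs from source task
    return "no transfer needed"                   # same task and same domain

# English -> French sentiment analysis: same task, different domains
print(transfer_type(same_task=True, same_domain=False))  # -> transductive transfer learning
```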
Notation:
- D_S ≠ D_T (domains differ);
- T_S ≠ T_T (tasks differ);

where D_S, D_T denote the source and target domains and T_S, T_T the source and target tasks.
Feature reuse: often, early layers of a neural network learn general features (e.g., edges in images) that can be reused across tasks.
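A minimal PyTorch sketch of this freeze-and-replace pattern, assuming torchvision is available; the target class count and learning rate are placeholders for a hypothetical medical-imaging task:

```python
import torch
import torch.nn as nn
from torchvision import models

# Load a model pretrained on ImageNet (the "source" task).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pretrained layers so their general features (edges, textures) are reused as-is.
for param in model.parameters():
    param.requires_grad = False

# Replace the classification head for the new "target" task,
# e.g. a hypothetical 3-class medical-imaging problem.
num_target_classes = 3
model.fc = nn.Linear(model.fc.in_features, num_target_classes)

# Only the new head's parameters are updated during fine-tuning.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
```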
The choice of transfer learning type depends on how similar the source and target domains and tasks are.