Few-Shot Learning
The ability of a model to learn new tasks or recognize new categories from just a handful of examples (typically 1-5), rather than requiring thousands of labeled samples.
In-Context Learning
Large language models demonstrate few-shot learning through in-context learning: you provide a few labeled examples in the prompt, and the model generalizes the pattern to new inputs. No weight updates are required; the model conditions purely on the examples in its context window.
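A minimal sketch of what a few-shot prompt looks like in practice. The sentiment task, label names, and example reviews are illustrative, not drawn from any particular model or dataset:

```python
# Build a few-shot prompt: labeled examples followed by a new query.
# The task and examples below are hypothetical, for illustration only.

examples = [
    ("The battery died after a week.", "negative"),
    ("Setup took thirty seconds and it just works.", "positive"),
    ("Shipping was slow, but the product itself is great.", "positive"),
]

def build_few_shot_prompt(examples, query):
    """Format labeled examples plus a new input into one prompt string."""
    lines = ["Classify the sentiment of each review as positive or negative.", ""]
    for text, label in examples:
        lines.append(f"Review: {text}")
        lines.append(f"Sentiment: {label}")
        lines.append("")
    lines.append(f"Review: {query}")
    lines.append("Sentiment:")  # the model completes this line
    return "\n".join(lines)

prompt = build_few_shot_prompt(examples, "Arrived broken and support never replied.")
print(prompt)
```

The model infers the task format and label set from the three examples alone, then completes the final "Sentiment:" line for the unseen review.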
Approaches
Meta-learning: training the model to learn effectively from few examples (e.g., MAML, Prototypical Networks).
Transfer learning: pre-training on large datasets, then adapting to new tasks with few examples.
Prompt engineering: crafting effective few-shot prompts for LLMs.
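The Prototypical Networks idea mentioned above can be sketched in a few lines: each class is represented by the mean ("prototype") of its few support embeddings, and a query is assigned to the nearest prototype. Real implementations learn the embedding network end to end; here the 2-D "embeddings" and class labels are hand-made assumptions for illustration:

```python
import math

def prototype(vectors):
    """Mean of a list of equal-length vectors (the class prototype)."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def euclidean(a, b):
    """Euclidean distance between two vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(query, prototypes):
    """Assign the query to the label of the nearest class prototype."""
    return min(prototypes, key=lambda label: euclidean(query, prototypes[label]))

# Five hand-made support embeddings per class: a "5-shot" episode in 2-D.
support = {
    "cat": [[1.0, 1.1], [0.9, 1.0], [1.2, 0.8], [1.1, 1.2], [0.8, 0.9]],
    "dog": [[-1.0, -0.9], [-1.1, -1.2], [-0.8, -1.0], [-1.2, -0.8], [-0.9, -1.1]],
}
prototypes = {label: prototype(vs) for label, vs in support.items()}
print(classify([1.0, 0.9], prototypes))  # nearest to the "cat" prototype
```

With only five examples per class, classification reduces to a nearest-mean lookup in embedding space; the heavy lifting in the real method is learning an embedding where that lookup works well.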
Significance
Few-shot learning dramatically reduces the data requirements for new tasks, making AI practical for domains where labeled data is expensive or scarce (medical, legal, specialized scientific fields).