Capability
Zero Shot Natural Language Inference Classification
20 artifacts provide this capability.
via “zero-shot text classification via natural language inference”
zero-shot-classification model. 2,743,704 downloads.
Unique: Combines BART's denoising seq2seq pre-training with fine-tuning on Multi-NLI to reformulate arbitrary classification as entailment reasoning. Each candidate label becomes a hypothesis to be tested against the input text, enabling true zero-shot classification without task-specific adaptation layers or further fine-tuning.
vs others: Outperforms GPT-2 and RoBERTa-based zero-shot classifiers on unseen categories thanks to explicit NLI training, while remaining 10-50x smaller and faster than GPT-3.5/4 APIs and requiring no external service dependencies.
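The entailment reformulation described above can be sketched in plain Python. This is a minimal illustration, not the model's actual inference code: `build_nli_pairs`, `classify`, and the `toy_score` word-overlap scorer are hypothetical stand-ins for a real NLI model's entailment probability, and the hypothesis template is one common choice, not the only one.

```python
# Sketch: zero-shot classification recast as NLI entailment.
# Each candidate label is wrapped in a hypothesis template and paired
# with the input text as the premise; the label whose hypothesis is
# most strongly entailed wins.

def build_nli_pairs(text, labels, template="This example is about {}."):
    """Turn a classification task into (premise, hypothesis) NLI pairs."""
    return [(text, template.format(label)) for label in labels]

def classify(text, labels, entail_score):
    """Return the label whose hypothesis gets the highest entailment score.

    `entail_score(premise, hypothesis)` stands in for a real NLI model
    (e.g. a BART model fine-tuned on Multi-NLI)."""
    pairs = build_nli_pairs(text, labels)
    scores = [entail_score(premise, hyp) for premise, hyp in pairs]
    return max(zip(labels, scores), key=lambda pair: pair[1])[0]

def toy_score(premise, hypothesis):
    """Toy stand-in scorer: fraction of hypothesis words found in the premise."""
    p = set(premise.lower().replace(".", "").split())
    h = set(hypothesis.lower().replace(".", "").split())
    return len(p & h) / len(h)

print(classify("I love football and watched the match.",
               ["football", "cooking"], toy_score))
```

In practice, the scorer is replaced by a real NLI model; with the `transformers` library this is exposed as `pipeline("zero-shot-classification", model=...)`, which runs the same premise/hypothesis loop internally.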