A complete knowledge distillation pipeline that compresses BERT (110M params) to DistilBERT (67M params) while retaining most of BERT's accuracy on SST-2 sentiment classification.
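The core of such a pipeline is the distillation objective: a weighted sum of a soft-target term (KL divergence between temperature-scaled teacher and student distributions) and the usual hard-label cross-entropy. A minimal NumPy sketch of that loss, with the temperature `T` and mixing weight `alpha` as assumed hyperparameter names:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax, numerically stabilized."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """alpha * KL(teacher || student) at temperature T  +  (1 - alpha) * CE on labels."""
    p_teacher = softmax(teacher_logits, T)
    log_p_student = np.log(softmax(student_logits, T))
    # KL term, scaled by T^2 so gradient magnitudes stay comparable (Hinton et al.)
    kl = np.mean(np.sum(p_teacher * (np.log(p_teacher) - log_p_student), axis=-1)) * T**2
    # Standard cross-entropy against the ground-truth labels (temperature 1)
    log_p_hard = np.log(softmax(student_logits))
    ce = -np.mean(log_p_hard[np.arange(len(labels)), labels])
    return alpha * kl + (1 - alpha) * ce
```

When the student's logits match the teacher's exactly, the KL term vanishes and only the hard-label loss remains, which is a quick sanity check during training.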
Abstract: Civil unrest, a major obstacle to a country's progress, demands timely detection and prevention. It causes serious harm, including loss of life, injury, resource depletion, ...
This repository implements an end‑to‑end sentiment analysis system for Twitter‑style text using a fine‑tuned DistilBERT model, exposed via a FastAPI REST API and a Streamlit UI, fully containerized ...
Abstract: Transformers have caused a paradigm shift in natural language processing. From text summarization to classification, these models have established new state-of-the-art results on ...