A complete knowledge distillation pipeline that compresses BERT (110M params) to DistilBERT (67M params) while maintaining high accuracy on SST-2 sentiment classification.
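The distillation objective described here is not spelled out in the snippet, but a standard formulation combines hard-label cross-entropy with a temperature-softened KL term between teacher and student distributions. The sketch below (pure Python, no framework dependencies) illustrates that combined loss; the temperature `T=2.0` and weight `alpha=0.5` are illustrative assumptions, not this pipeline's actual hyperparameters.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, label, T=2.0, alpha=0.5):
    """Combine hard-label cross-entropy with soft-target KL divergence.

    alpha weights the hard cross-entropy term; (1 - alpha) * T**2 weights
    the KL between the softened teacher and student distributions (the
    T**2 factor keeps soft-target gradient magnitudes comparable as T grows).
    """
    p_student = softmax(student_logits)            # hard-label path (T = 1)
    hard_ce = -math.log(p_student[label])
    p_t = softmax(teacher_logits, T)               # softened teacher targets
    p_s = softmax(student_logits, T)               # softened student predictions
    kl = sum(pt * math.log(pt / ps) for pt, ps in zip(p_t, p_s))
    return alpha * hard_ce + (1 - alpha) * (T ** 2) * kl
```

When the student already matches the teacher exactly, the KL term vanishes and only the hard cross-entropy contributes, which is a quick sanity check for the implementation.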
smart_contract_detection_xAI/
├── config.py       # Central configuration (paths, models, hyperparameters)
├── utils.py        # Logging, GPU detection, timing, pickle caching
├── data_loader.py  # Dataset loading, ...
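The tree above describes `config.py` as central configuration for paths, models, and hyperparameters. A minimal sketch of such a module might look like the following; every field name and value here is a hypothetical placeholder, since the repository's actual contents are not shown in this snippet.

```python
from dataclasses import dataclass
from pathlib import Path

@dataclass(frozen=True)
class Config:
    """Illustrative central configuration; all fields are assumptions."""
    data_dir: Path = Path("data")            # raw dataset location
    cache_dir: Path = Path(".cache")         # pickle cache for preprocessed data
    model_name: str = "distilbert-base-uncased"  # backbone checkpoint
    max_seq_len: int = 128                   # tokenizer truncation length
    batch_size: int = 32
    learning_rate: float = 2e-5
    epochs: int = 3

CONFIG = Config()  # single shared instance imported by the other modules
```

Freezing the dataclass makes the configuration immutable, so `utils.py` and `data_loader.py` can import `CONFIG` without risk of one module mutating settings for another.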
Abstract: This paper presents a real-time email phishing detection system built on a custom DistilBERT model whose architecture incorporates dynamic threshold adjustment and an ...
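The abstract names "dynamic threshold adjustment" without defining it. One plausible reading, sketched below purely as an assumption, is to raise the decision threshold toward a high quantile of recently observed phishing scores while never dropping below a fixed floor; the function name and parameters are hypothetical.

```python
def adjust_threshold(recent_scores, base=0.5, quantile=0.95):
    """Hypothetical dynamic threshold: track a rolling high quantile of
    recent model scores, floored at a fixed base threshold, so the
    decision boundary adapts to the current score distribution."""
    ranked = sorted(recent_scores)
    idx = min(int(quantile * len(ranked)), len(ranked) - 1)
    return max(base, ranked[idx])
```

With mostly benign traffic the quantile sits low and the base threshold dominates; a burst of high-scoring emails pushes the threshold up, reducing false positives during score drift.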
Abstract: We analyze public sentiment on the 17+8 movement (Aug 2025) using IndoBERT and DistilBERT. A corpus of 4,441 texts from Google News, YouTube, and Reddit was labeled via two LLMs (SahabatAI, ...
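The snippet says the corpus was labeled via two LLMs but cuts off before describing how their outputs were combined. A common practice in LLM-assisted annotation, shown here only as an assumed sketch (the paper's actual protocol is not visible in this excerpt), is to keep a text's label only when both annotators agree.

```python
def consensus_labels(labels_a, labels_b):
    """Keep only the texts where both LLM annotators assign the same
    label; disagreements are dropped (or could be routed to a human).
    Returns a mapping from text index to the agreed label."""
    return {i: a for i, (a, b) in enumerate(zip(labels_a, labels_b)) if a == b}
```

Disagreement rate between the two annotators also gives a cheap proxy for label quality before any model training starts.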