SciBERT: A Pretrained Language Model for Scientific Text - ar5iv
https://ar5iv.labs.arxiv.org/html/1903.10676
We release SciBERT, a pretrained language model based on BERT (Devlin et al.), to address the lack of high-quality, large-scale labeled scientific data. SciBERT leverages unsupervised pretraining on a large multi-domain corpus of scientific publications to improve performance on downstream scientific NLP tasks.