BERT
This module includes classes to enable compatibility with BERT models (Devlin et al., 2018) that are stored on the HuggingFace hub. For instance, the work this package was originally developed for (Ulmer et al., 2022) used three different BERTs:
- The original, English BERT by Devlin et al. (2018) (bert-base-uncased).
- The Danish BERT developed by Hvingelby et al. (2020) (alexanderfalk/danbert-small-cased).
- The Finnish BERT provided by Virtanen et al. (2019) (TurkuNLP/bert-base-finnish-cased-v1).
The BERT model to be used can be specified via the bert_name argument of nlp_uncertainty_zoo.models.bert.BertModule.__init__().
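The selection described above can be sketched as follows. This is a minimal illustration, not the package's documented usage: the HuggingFace hub identifiers come from the list above, but any BertModule constructor arguments beyond bert_name are unknown here, so the actual instantiation is only shown as a commented-out assumption.

```python
# HuggingFace hub identifiers of the three BERTs mentioned above.
BERT_NAMES = {
    "english": "bert-base-uncased",
    "danish": "alexanderfalk/danbert-small-cased",
    "finnish": "TurkuNLP/bert-base-finnish-cased-v1",
}


def bert_name_for(language: str) -> str:
    """Look up the hub identifier to pass as ``bert_name``."""
    return BERT_NAMES[language]


# Hypothetical instantiation (remaining constructor arguments omitted,
# as they are not specified in this section):
# from nlp_uncertainty_zoo.models.bert import BertModule
# module = BertModule(bert_name=bert_name_for("finnish"), ...)

print(bert_name_for("danish"))  # → alexanderfalk/danbert-small-cased
```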