Transformer Encoder-based Forecasting Model

Please note: this model requires installing ConvoKit with the optional LLM packages via pip install convokit[llm].

A ConvoKit Forecaster-adherent implementation of a conversational forecasting model based on a Transformer encoder model (e.g., BERT, RoBERTa, SpanBERT, DeBERTa). This class was first used in the paper "Conversations Gone Awry, But Then? Evaluating Conversational Forecasting Models" (Tran et al., 2025).
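As a rough sketch of how such a model plugs into the ConvoKit Forecaster pipeline: the snippet below loads a labeled corpus, wraps a HuggingFace encoder checkpoint, and runs fit/transform. The class name TransformerEncoderModel, its import path, and its constructor arguments are assumptions here, not a verbatim API reference; consult the class documentation for the exact signature.

```python
# A minimal sketch, assuming the class is exposed as TransformerEncoderModel
# and accepts a HuggingFace checkpoint name; verify against the class reference.
from convokit import Corpus, download, Forecaster
from convokit.forecaster import TransformerEncoderModel  # assumed import path

# Load a conversational corpus with ground-truth derailment labels
corpus = Corpus(filename=download("conversations-gone-awry-corpus"))

# Wrap a HuggingFace encoder checkpoint as a ConvoKit forecasting model
model = TransformerEncoderModel("roberta-base")  # assumed constructor

forecaster = Forecaster(
    forecaster_model=model,
    # Conversation-level metadata field used as the label (assumed field name)
    labeler="conversation_has_personal_attack",
)
forecaster.fit(corpus)          # fine-tune the encoder on the labeled corpus
forecaster.transform(corpus)    # attach per-utterance forecasts as metadata
```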

IMPORTANT NOTE: This implementation can, in fact, support any model compatible with HuggingFace's AutoModelForSequenceClassification, including decoder-based models such as Gemma and LLaMA. However, for large language models we suggest using parameter-efficient fine-tuning techniques (e.g., LoRA). To facilitate this, we provide a separate class specifically designed for decoder-based architectures.
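To illustrate the kind of parameter-efficient setup this note refers to, here is a hedged sketch using the HuggingFace transformers and peft libraries to wrap a decoder checkpoint for sequence classification with LoRA. The checkpoint name and LoRA hyperparameters are illustrative placeholders, not values from the paper or from ConvoKit's decoder class.

```python
# Illustrative sketch of LoRA-based parameter-efficient fine-tuning for a
# decoder LLM used as a sequence classifier; the checkpoint name and all
# hyperparameters are placeholders, not recommendations from the paper.
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from peft import LoraConfig, get_peft_model

checkpoint = "meta-llama/Llama-2-7b-hf"  # any classification-compatible model
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

if tokenizer.pad_token is None:  # LLaMA-style tokenizers lack a pad token
    tokenizer.pad_token = tokenizer.eos_token
    model.config.pad_token_id = tokenizer.pad_token_id

# Freeze the base weights and train only low-rank adapter matrices
lora_config = LoraConfig(
    task_type="SEQ_CLS",          # keeps the classification head trainable
    r=8,                          # rank of the low-rank update
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # attention projections in LLaMA-style models
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of the full model
```

This is only meant to show why parameter-efficient fine-tuning is the suggested route for large decoder models: only the small adapter matrices receive gradients, so memory and compute stay tractable. The separate decoder-oriented class mentioned above is the supported way to do this within ConvoKit.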