When I train the model with custom labels, the training code works fine. However, adapting the Inference.py code to my custom-trained model does not work.
I changed the Inference.ipynb code to handle my 11 labels as follows:
LABELS = ["Adjective", "API", "Core", "GUI", "Hardware", "Language", "Platform", "Standard", "User", "Verb", "O"]
template_list = [" is a %s entity" % (e) for e in LABELS]
entity_dict = {i: e for i, e in enumerate(LABELS)}
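For reference, here is a quick standalone check of what those two comprehensions produce with the 11 labels above (same code as in the post, just printed out):

```python
LABELS = ["Adjective", "API", "Core", "GUI", "Hardware", "Language",
          "Platform", "Standard", "User", "Verb", "O"]

# One query template per label, and an index -> label lookup table.
template_list = [" is a %s entity" % (e) for e in LABELS]
entity_dict = {i: e for i, e in enumerate(LABELS)}

print(len(template_list))   # 11
print(template_list[0])     # ' is a Adjective entity'
print(entity_dict[10])      # 'O'
```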
Here is how I load the checkpoint:
tokenizer = BartTokenizer.from_pretrained('facebook/bart-large')
model = BartForConditionalGeneration.from_pretrained('./outputs/best_model')
Here is the inference call and the resulting error:
prediction("As a user I should be able to use the attribute type User in my queries.")
RuntimeError
----> 2 prediction("As a user I should be able to use the attribute type User in my queries.")
/usr/local/lib/python3.7/dist-packages/transformers/models/bart/modeling_bart.py in _shape(self, tensor, seq_len, bsz)
157 def _shape(self, tensor: torch.Tensor, seq_len: int, bsz: int):
--> 158 return tensor.view(bsz, seq_len, self.num_heads, self.head_dim).transpose(1, 2).contiguous()
RuntimeError: shape '[88, -1, 16, 64]' is invalid for input of size 778240
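A sanity check on the numbers in the error (my own arithmetic, not something from the repo): `view(bsz, -1, num_heads, head_dim)` only works if the tensor's element count is divisible by `bsz * num_heads * head_dim`. Here `88 * 16 * 64 = 90112` does not divide 778240, so the batch dimension of 88 no longer matches the actual tensor. This suggests some part of the inference code still assumes the original number of labels/templates rather than my 11.

```python
# Reproducing the arithmetic behind the RuntimeError above.
numel = 778240                          # "input of size 778240"
bsz, num_heads, head_dim = 88, 16, 64   # shape '[88, -1, 16, 64]'

per_position = num_heads * head_dim     # 1024 elements per token position
print(numel // per_position)            # 760 positions actually in the tensor
print(numel % (bsz * per_position))     # nonzero remainder -> view() must fail
```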