I got an error while training a model that uses LanguageModelFeaturizer with BERT. The same configuration worked on Rasa 2.x. Does anyone know a solution? Thanks.
Here is the relevant output, traceback, and pipeline configuration:
All the layers of TFBertModel were initialized from the model checkpoint at rasa/LaBSE.
If your task is similar to the task the model of the checkpoint was trained on, you can already use TFBertModel for predictions without further training.
Traceback (most recent call last):
File "/home/hko/jonathan/venv2/lib/python3.7/site-packages/rasa/engine/graph.py", line 461, in __call__
output = self._fn(self._component, **run_kwargs)
File "/home/hko/jonathan/venv2/lib/python3.7/site-packages/rasa/nlu/featurizers/dense_featurizer/lm_featurizer.py", line 731, in process_training_data
batch_docs = self._get_docs_for_batch(batch_messages, attribute)
File "/home/hko/jonathan/venv2/lib/python3.7/site-packages/rasa/nlu/featurizers/dense_featurizer/lm_featurizer.py", line 689, in _get_docs_for_batch
batch_token_ids, batch_tokens, batch_examples, attribute, inference_mode
File "/home/hko/jonathan/venv2/lib/python3.7/site-packages/rasa/nlu/featurizers/dense_featurizer/lm_featurizer.py", line 620, in _get_model_features_for_batch
batch_attention_mask, padded_token_ids
File "/home/hko/jonathan/venv2/lib/python3.7/site-packages/rasa/nlu/featurizers/dense_featurizer/lm_featurizer.py", line 470, in _compute_batch_sequence_features
np.array(padded_token_ids), attention_mask=np.array(batch_attention_mask)
File "/home/hko/jonathan/venv2/lib/python3.7/site-packages/keras/engine/base_layer.py", line 1037, in __call__
outputs = call_fn(inputs, *args, **kwargs)
File "/home/hko/jonathan/venv2/lib/python3.7/site-packages/transformers/models/bert/modeling_tf_bert.py", line 1143, in call
training=inputs["training"],
File "/home/hko/jonathan/venv2/lib/python3.7/site-packages/keras/engine/base_layer.py", line 1037, in __call__
outputs = call_fn(inputs, *args, **kwargs)
File "/home/hko/jonathan/venv2/lib/python3.7/site-packages/transformers/models/bert/modeling_tf_bert.py", line 803, in call
attention_mask_shape = shape_list(inputs["attention_mask"])
File "/home/hko/jonathan/venv2/lib/python3.7/site-packages/transformers/modeling_tf_utils.py", line 1831, in shape_list
static = tensor.shape.as_list()
AttributeError: 'tuple' object has no attribute 'as_list'
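To illustrate where the final AttributeError comes from: `transformers.modeling_tf_utils.shape_list` calls `tensor.shape.as_list()`, which exists on a TensorFlow `TensorShape` but not on the plain Python tuple that a numpy array exposes as `.shape`. The sketch below is not Rasa or transformers code; it is a minimal stand-in (with a hypothetical `FakeNumpyArray` class) reproducing the same failure mode without any heavy dependencies:

```python
def shape_list_sketch(tensor):
    """Simplified stand-in for transformers.modeling_tf_utils.shape_list."""
    # On a tf.Tensor, .shape is a TensorShape and .as_list() works.
    # On a numpy array, .shape is a plain tuple, so this line raises.
    static = tensor.shape.as_list()
    return static


class FakeNumpyArray:
    """Stand-in for np.ndarray: its .shape is a plain tuple, like numpy's."""
    shape = (2, 3)


try:
    shape_list_sketch(FakeNumpyArray())
except AttributeError as err:
    print(err)  # 'tuple' object has no attribute 'as_list'
```

This matches the traceback: the featurizer passes `np.array(...)` inputs into the model's `call`, and the installed transformers version ends up asking a tuple for `as_list`, which suggests a version mismatch between Rasa's featurizer code and the installed transformers/TensorFlow rather than a problem with the training data itself.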
pipeline:
  - name: "SpacyNLP"
    model: "zh_core_web_trf"
  - name: "SpacyTokenizer"
  - name: RegexFeaturizer
  - name: LexicalSyntacticFeaturizer
  - name: CountVectorsFeaturizer
  - name: CountVectorsFeaturizer
    analyzer: char_wb
    min_ngram: 1
    max_ngram: 4
  - name: LanguageModelFeaturizer
    model_name: "bert"
    #model_name: "bert"
    #model_weights: "rasa/LaBSE"
    #model_weights: "bert-base-multilingual-uncased"
    cache_dir: null
  - name: DIETClassifier
    epochs: 100
  - name: EntitySynonymMapper
  - name: ResponseSelector
    epochs: 100
  - name: FallbackClassifier
    threshold: 0.3
    ambiguity_threshold: 0.1
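For clarity, the commented-out lines above show the combinations already tried. A sketch of the featurizer entry with the weights made explicit (assuming the Rasa 3.x `LanguageModelFeaturizer` options, where `model_name: "bert"` defaults to the `rasa/LaBSE` weights anyway) would look like:

```yaml
  - name: LanguageModelFeaturizer
    model_name: "bert"
    model_weights: "rasa/LaBSE"   # weights stated explicitly instead of relying on the default
    cache_dir: null
```

Being explicit here does not by itself fix the AttributeError above, but it rules out ambiguity about which checkpoint is being loaded when comparing against the working Rasa 2.x setup.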