I want to use the spaCy model `zh_core_web_trf` in Rasa 2.4, but it does not work as well as I expected; the model `zh_core_web_lg` actually works better, even though the benchmarks on the spaCy website show `zh_core_web_trf` performing better. This log message appears on the command line: "rasa.nlu.featurizers.dense_featurizer.spacy_featurizer - no features present. you are using an empty spacy model." I installed the model with the following steps: `pip install spacy==3.0.5`, then `pip install (the path to zh_core_web_trf model)`. Please tell me if I did something wrong. Below is my config from the file config.yml.
So there are a few things to unpack here.
- The next big Rasa release (2.5) will start supporting spaCy 3.0. Part of what is going wrong here is that any version of spaCy before 3.0 doesn't have the `trf` (transformer) models properly built in.
- The current implementation of Rasa only grabs the `Doc.vector` property, while what you're probably after are the transformer tensors. This feature isn't on the roadmap directly because spaCy uses Hugging Face models under the hood. Since we already support those models directly via our LanguageModelFeaturizer component, we figured it would be best to keep that functionality in one component. If you're curious to try it out, you might appreciate the `rasa/LaBSE` setting. It's a multi-language model that supports Chinese, but you should also be able to attach a BERT model that's optimised specifically for Chinese.
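For reference, a pipeline using the LanguageModelFeaturizer with `rasa/LaBSE` might look something like this in `config.yml`. This is a sketch, not a drop-in config: the surrounding components and the `epochs` value are illustrative, and you'd tune them for your own project.

```yaml
language: zh

pipeline:
  # Chinese text has no whitespace between words; Rasa also ships a
  # JiebaTokenizer, which may segment Chinese better than whitespace splitting.
  - name: WhitespaceTokenizer
  - name: LanguageModelFeaturizer
    # Loads a Hugging Face model; rasa/LaBSE is a multilingual BERT variant
    # that covers Chinese. A Chinese-specific BERT checkpoint could be
    # swapped in here instead.
    model_name: bert
    model_weights: rasa/LaBSE
  - name: DIETClassifier
    epochs: 100
```

Note that this route doesn't go through spaCy at all, which is the point: the transformer features come from the LanguageModelFeaturizer directly.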
So, Rasa 2.5 will support the `trf` (transformer) models integrated in spaCy 3.0, right?
That's not currently on the official Rasa roadmap. I might implement it over on the Rasa NLU Examples repo, but first I'd like to understand whether spaCy offers more than the Hugging Face models we already support.