@Chaitanya could you let me know if you find a solution, please?
I am sure you must have done this already: your labels.txt
should have all the labels on separate lines, one after another.
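For reference, here is a minimal sketch of how such a file is typically read into the labels list used during training (the file name comes from this thread; your component may load it differently):

# read one label per line from labels.txt, skipping empty lines
with open("labels.txt", "r") as label_file:
    labels = [line.strip() for line in label_file if line.strip()]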
Yes, I have done that. It was just to show you the file: I copied its content and it gave this.
@m2cci-sba can you share what the labeled_data
looks like?
labeled_data = [(t, x) for t, x in zip(processed_tokens, labels)]
print("labeled_data : ")
print(labeled_data)
self.clf = NaiveBayesClassifier.train(labeled_data)
I mean, what’s the output if you print labeled_data between those 2 lines?
SOLVED
I was also getting a similar error… my classifier is not set up for Sentiment, but it uses a similar function and a similar pipeline. There were a few pieces to this puzzle.
First, ensure that you require "tokens" within your component. Otherwise you will not be able to reference them during training via:
training_data.training_examples[SOME VALUE].get('tokens')
If you'd rather ignore tokens entirely, you can use the simple workaround in the NEW snippet below.
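For context, a minimal sketch of how that requirement is usually declared in a Rasa 1.x custom component; the class name and the other attributes are placeholders for whatever your component already defines:

from rasa.nlu.components import Component

class SentimentAnalyzer(Component):
    name = "sentiment"
    provides = ["entities"]
    # declaring this dependency is what makes message.get("tokens")
    # available during train() and process()
    requires = ["tokens"]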
OLD:
training_data = training_data.training_examples #list of Message objects
tokens = [list(map(lambda x: x.text, t.get('tokens'))) for t in training_data] #HERE
processed_tokens = [self.preprocessing(t) for t in tokens]
labeled_data = [(t, x) for t,x in zip(processed_tokens, labels)]
self.clf = NaiveBayesClassifier.train(labeled_data)
NEW:
training_data = training_data.training_examples
tokens = [t.text.split() for t in training_data] #HERE
processed_tokens = [self.preprocessing(t) for t in tokens]
labeled_data = [(t, x) for t,x in zip(processed_tokens, labels)]
self.clf = NaiveBayesClassifier.train(labeled_data)
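As a sanity check for the print(labeled_data) question above: NLTK's NaiveBayesClassifier.train expects a list of (feature_dict, label) pairs, so preprocessing typically turns each token list into a bag-of-words dict. A minimal sketch (your actual preprocessing may differ):

def preprocessing(self, tokens):
    # turn a list of tokens into the {feature: True} dict NLTK expects
    return {token: True for token in tokens}

# labeled_data should then look something like:
# [({'great': True, 'service': True}, 'pos'),
#  ({'terrible': True, 'food': True}, 'neg'), ...]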
Secondly, the reason you are getting the same value is that the model is not saving/loading correctly. It turns out that rasa.nlu.utils is the culprit here. Originally, the Sentiment Analysis component used utils.json_pickle and utils.json_unpickle to save/load the classifier. I'm not sure why this code suddenly broke, but it did.
SOLUTION:
Use pickle instead of rasa.nlu.utils to save/load. See the code below for persist/load.
def _write_model(self, model_file, classifier):
    """Pickle the trained classifier to disk."""
    save_classifier = open(model_file, "wb")
    pickle.dump(classifier, save_classifier)
    save_classifier.close()

def persist(
    self,
    file_name: Text,
    model_dir: Text
) -> Optional[Dict[Text, Any]]:
    """Persist this model into the passed directory."""
    if self.clf:
        model_file_name = os.path.join(model_dir, MODEL_FILE_NAME)
        self._write_model(model_file_name, self.clf)
    # this dict is merged into the component's metadata and handed back to load()
    return {"domain_classifier_model": MODEL_FILE_NAME}
@classmethod
def load(
    cls,
    meta: Dict[Text, Any],
    model_dir: Optional[Text] = None,
    model_metadata: Optional[Metadata] = None,
    cached_component: Optional["YOURCLASSNAME"] = None,
    **kwargs: Any
) -> "YOURCLASSNAME":
    # meta must contain the classifier file name (from your component config
    # or from the dict returned by persist); if it resolves to None,
    # os.path.join below raises "join() argument must be str or bytes, not 'NoneType'"
    file_name = meta.get("classifier_model")
    classifier_file = os.path.join(model_dir, file_name)
    if os.path.exists(classifier_file):
        classifier_f = open(classifier_file, "rb")
        clf = pickle.load(classifier_f)
        classifier_f.close()
        return cls(meta, clf)
    else:
        return cls(meta)
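For completeness, the methods above assume the following imports and a module-level file-name constant near the top of the component file (import paths assume Rasa 1.x; the constant's value is just a placeholder, but use the same name everywhere):

import os
import pickle
from typing import Any, Dict, Optional, Text

from rasa.nlu.components import Component
from rasa.nlu.model import Metadata

# file the pickled classifier is written to inside the model directory (placeholder name)
MODEL_FILE_NAME = "sentiment_classifier.pkl"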
Good luck!
@Collen-Roller Thank you very much for your answer!
I'm getting this error; do you have any idea?
2019-08-07 13:37:23 INFO root - Starting Rasa Core server on http://localhost:5005
initialised the class
2019-08-07 13:37:46 INFO rasa.nlu.components - Added 'SpacyNLP' to component cache. Key 'SpacyNLP-fr_core_news_md'.
2019-08-07 13:38:02 INFO rasa.nlu.components - Added 'SpacyNLP' to component cache. Key 'SpacyNLP-fr_core_news_md'.
2019-08-07 13:38:02 ERROR rasa.core.agent - Could not load model due to join() argument must be str or bytes, not 'NoneType'.
[2019-08-07 13:38:02 +0200] [4888] [ERROR] Experienced exception while trying to serve
Traceback (most recent call last):
  File "c:\users\sbaja\appdata\local\programs\python\python36\lib\site-packages\sanic\app.py", line 1096, in run
    serve(**server_settings)
  File "c:\users\sbaja\appdata\local\programs\python\python36\lib\site-packages\sanic\server.py", line 742, in serve
    trigger_events(before_start, loop)
  File "c:\users\sbaja\appdata\local\programs\python\python36\lib\site-packages\sanic\server.py", line 604, in trigger_events
    loop.run_until_complete(result)
  File "c:\users\sbaja\appdata\local\programs\python\python36\lib\asyncio\base_events.py", line 467, in run_until_complete
    return future.result()
  File "c:\users\sbaja\rasa\rasa\core\run.py", line 188, in load_agent_on_start
    action_endpoint=endpoints.action,
  File "c:\users\sbaja\rasa\rasa\core\agent.py", line 233, in load_agent
    remote_storage=remote_storage,
  File "c:\users\sbaja\rasa\rasa\core\agent.py", line 878, in load_local_model
    remote_storage=remote_storage,
  File "c:\users\sbaja\rasa\rasa\core\agent.py", line 364, in load
    interpreter = NaturalLanguageInterpreter.create(nlu_model)
  File "c:\users\sbaja\rasa\rasa\core\interpreter.py", line 45, in create
    return RasaNLUInterpreter(model_directory=obj)
  File "c:\users\sbaja\rasa\rasa\core\interpreter.py", line 246, in __init__
    self._load_interpreter()
  File "c:\users\sbaja\rasa\rasa\core\interpreter.py", line 264, in _load_interpreter
    self.interpreter = Interpreter.load(self.model_directory)
  File "c:\users\sbaja\rasa\rasa\nlu\model.py", line 301, in load
    return Interpreter.create(model_metadata, component_builder, skip_validation)
  File "c:\users\sbaja\rasa\rasa\nlu\model.py", line 328, in create
    component_meta, model_metadata.model_dir, model_metadata, **context
  File "c:\users\sbaja\rasa\rasa\nlu\components.py", line 433, in load_component
    component_meta, model_dir, model_metadata, cached_component, **context
  File "c:\users\sbaja\rasa\rasa\nlu\registry.py", line 185, in load_component_by_meta
    component_meta, model_dir, metadata, cached_component, **kwargs
  File "C:\Users\sbaja\Desktop\gitlab\chatbot_nlp\sentiment.py", line 135, in load
    classifier_file = os.path.join(model_dir, file_name)
  File "c:\users\sbaja\appdata\local\programs\python\python36\lib\ntpath.py", line 114, in join
    genericpath._check_arg_types('join', path, *paths)
  File "c:\users\sbaja\appdata\local\programs\python\python36\lib\genericpath.py", line 149, in _check_arg_types
    (funcname, s.__class__.__name__)) from None
TypeError: join() argument must be str or bytes, not 'NoneType'
Traceback (most recent call last):
  File "C:\Users\sbaja\AppData\Local\Programs\Python\Python36\Scripts\rasa-script.py", line 11, in <module>
    load_entry_point('rasa', 'console_scripts', 'rasa')()
  File "c:\users\sbaja\rasa\rasa\__main__.py", line 70, in main
    cmdline_arguments.func(cmdline_arguments)
  File "c:\users\sbaja\rasa\rasa\cli\shell.py", line 97, in shell
    rasa.cli.run.run(args)
  File "c:\users\sbaja\rasa\rasa\cli\run.py", line 70, in run
    rasa.run(**vars(args))
  File "c:\users\sbaja\rasa\rasa\run.py", line 65, in run
    **kwargs
  File "c:\users\sbaja\rasa\rasa\core\run.py", line 150, in serve_application
    app.run(host="0.0.0.0", port=port)
  [... the remaining frames are identical to the traceback above ...]
TypeError: join() argument must be str or bytes, not 'NoneType'
What is your model_dir and file_name?
Make sure you RETRAIN your model… that could be part of the issue…
Also, you're going to want to update your __init__ function to accept a classifier! You need to pass back in the classifier you saved when the load function is called.
def __init__(
    self,
    component_config: Optional[Dict[Text, Any]] = None,
    clf=None
) -> None:
    super(YOUR_CLASSIFIER, self).__init__(component_config)
    self.clf = clf
Finally, make sure you have defined your classifier file somewhere… your Sentiment analyzer needs to know where to save/load the model to/from
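One way to do that (purely a sketch, assuming the Rasa 1.x defaults mechanism; the key name must match what load() reads and the file name must match what persist() writes):

class SentimentAnalyzer(Component):
    # hypothetical file name; it must be the same name persist() writes
    # (MODEL_FILE_NAME above), so that load() can find the pickle again
    defaults = {"classifier_model": "sentiment_classifier.pkl"}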
@Collen-Roller Thank you for your response :) Can you send me a mail at this address? I need to ask you some questions: baja.sarra@gmail.com. Thank you very much.
Feel free to comment on this thread or start a new one to ask any additional questions!
How can I print the sentiment and confidence that was predicted?
@Collen-Roller Your solution worked for me, but I still have some doubts: do I have to change my process method too? Secondly, could you please share the GitHub repo of your Rasa custom component code?