Evaluation with Core


Can you do hyperparameter optimization with Core’s evaluation?

It would also be nice to see the impact of, let’s say, a categorical slot on the next action prediction in a story.

Sometimes just including a slot value doesn’t change the prediction at all. So it would be great to have an evaluation that analyses which features matter for predicting the next action.

There are a couple of different questions here.

You can use a library like hyperopt to do hyperparameter optimization. You’ll have to write a small script that wraps Rasa Core’s train and evaluate methods, though.
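The wrapper pattern is the same whichever search library you use: a function that trains with a candidate set of hyperparameters, evaluates, and returns a score, plus a loop that samples the search space. A minimal random-search sketch, with `train_and_evaluate` as a hypothetical stand-in for your script that calls Rasa Core’s train and evaluate methods (hyperopt would replace the loop with smarter TPE sampling):

```python
import random

# Hypothetical stand-in: in your real script, train a model with these
# hyperparameters, run Rasa Core's evaluation, and return a score such
# as story-level accuracy. The toy formula below just makes the sketch
# runnable (it pretends epochs=100, batch_size=32 is best).
def train_and_evaluate(params):
    return (1.0
            - abs(params["epochs"] - 100) / 200
            - abs(params["batch_size"] - 32) / 64)

# Candidate values for each hyperparameter (assumed names, for illustration).
search_space = {
    "epochs": [50, 100, 200],
    "batch_size": [8, 16, 32, 64],
}

def random_search(n_trials=20, seed=0):
    rng = random.Random(seed)
    best_params, best_score = None, float("-inf")
    for _ in range(n_trials):
        # Sample one candidate configuration from the search space.
        params = {name: rng.choice(values)
                  for name, values in search_space.items()}
        score = train_and_evaluate(params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

best_params, best_score = random_search()
print(best_params, best_score)
```

With hyperopt you would express `search_space` with `hp.choice` and hand the objective to `fmin`, but the wrapper around train/evaluate stays the same.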

Understanding the impact of a single feature (say, a categorical slot) on a prediction is a different topic. You could try something like LIME, although I know there are other libraries for inspecting network sensitivities; I don’t have experience with any of them.
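Before reaching for LIME, a lighter-weight sanity check is to perturb just the slot in question and see whether the predicted next action changes at all. A sketch with a toy scoring function standing in for a trained policy (the function, feature names, and action names here are all hypothetical, purely to illustrate the probing pattern):

```python
# Toy stand-in for a policy: maps a feature dict to the highest-scoring
# next action. In practice you would call your trained model's predict
# method on the featurized tracker state instead.
def predict_next_action(features):
    scores = {
        "utter_ask_size": 1.0 if features.get("slot_pizza_size") is None else 0.0,
        "utter_confirm_order": 1.0 if features.get("slot_pizza_size") is not None else 0.0,
    }
    return max(scores, key=scores.get)

def slot_sensitivity(features, slot, values):
    """Predict the next action for each candidate slot value.

    If every value maps to the same action, the slot has no effect on
    this particular prediction.
    """
    predictions = {}
    for value in values:
        perturbed = dict(features, **{slot: value})
        predictions[value] = predict_next_action(perturbed)
    return predictions

base_state = {"intent_order_pizza": 1.0, "slot_pizza_size": None}
print(slot_sensitivity(base_state, "slot_pizza_size", [None, "small", "large"]))
# → {None: 'utter_ask_size', 'small': 'utter_confirm_order', 'large': 'utter_confirm_order'}
```

LIME automates a fancier version of this idea (many random perturbations plus a local surrogate model), but a one-slot toggle like the above is often enough to answer "does this slot change anything?"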