Can Rasa Core work on an embedded system without GPU support?

Hi there! Can Rasa Core work on an embedded system without GPU support? For example, can I run Rasa Core on a Raspberry Pi or a BeagleBoard without a GPU? I don't want to rely on GPU support at all. Is this a suitable use case for Rasa? Thanks a lot.

@cstsai Rasa doesn't need GPU support. However, memory could be a limiting factor, but since you are not using NLU it should be fine (language models can be quite big). Are you using the embedded system to train as well, or just for inference?

Sorry, "Rasa-Core" above was imprecise. Actually, I plan to port both NLU and Rasa's dialogue management (i.e. Rasa Core) to an embedded system. The embedded system will be an edge device. For the NLU stage I will use a MITIE model trained for Chinese. In my case, is 4 GB of memory enough? Also, if the embedded system only needs to run inference, is that okay (enough)? The device may be a Raspberry Pi 4 or a BeagleBoard. Thanks a lot.

Uff, the MITIE language model is huge, isn't it? The problem is that the whole language model will be loaded into memory and then :boom:

I'd rather use a small sklearn model, or train the EmbeddingIntentClassifier (https://rasa.com/docs/rasa/nlu/components/#embeddingintentclassifier) with a restricted vocabulary size / embedding size.
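
For example, just a rough sketch using the Rasa 1.x Python API: the component and parameter names (`JiebaTokenizer`, `max_features`, `embed_dim`) are taken from the Rasa 1.x docs, so double-check them against your installed version, and the data/model paths are placeholders:

```python
# Sketch: train a lightweight Rasa 1.x NLU model instead of using MITIE.
# Requires the jieba package for the Chinese tokenizer.
from rasa.nlu.config import RasaNLUModelConfig
from rasa.nlu.model import Trainer
from rasa.nlu.training_data import load_data

config = RasaNLUModelConfig({
    "language": "zh",
    "pipeline": [
        {"name": "JiebaTokenizer"},              # Chinese word segmentation
        {"name": "CountVectorsFeaturizer",
         "max_features": 5000},                  # cap the vocabulary size
        {"name": "EmbeddingIntentClassifier",
         "embed_dim": 10,                        # smaller embedding dimension
         "epochs": 100},
    ],
})

training_data = load_data("data/nlu.md")         # placeholder: your NLU training examples
trainer = Trainer(config)
trainer.train(training_data)
model_dir = trainer.persist("models/", fixed_model_name="nlu-small")
print(f"Model persisted to {model_dir}")
```

The idea is simply to keep the featurizer vocabulary and the embedding dimension small so the persisted model stays far below the MITIE footprint.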

Thanks for your advice. Currently, my MITIE model trained for Chinese (used in the NLU stage) is about 332 MB. In this case, is it still possible to port Rasa to an embedded system (i.e. a Raspberry Pi 4 or BeagleBoard with 4 GB of memory)? Given the performance of such an embedded system, can Rasa run well on it?

Sorry for my late reply, I was on vacation.

Given the performance of such an embedded system, can Rasa run well on it?

I think that could work, but you’d have to try it yourself. Would love to hear how it went!
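
If you do try it, one quick sanity check on the Pi would be to load the persisted model and watch the process's resident memory. A rough sketch, assuming a Rasa 1.x model directory and the psutil package (the model path and sample utterance are placeholders):

```python
# Sketch: measure how much memory the loaded NLU model actually uses.
import os
import psutil
from rasa.nlu.model import Interpreter

process = psutil.Process(os.getpid())

def rss_mb():
    # Resident set size of this process in megabytes.
    return process.memory_info().rss / (1024 * 1024)

print(f"Before loading: {rss_mb():.0f} MB")
interpreter = Interpreter.load("models/nlu-small")   # placeholder: path returned by trainer.persist()
print(f"After loading:  {rss_mb():.0f} MB")

result = interpreter.parse("一個測試句子")             # any sample Chinese utterance
print(result["intent"])
```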