Tokenizer_spacy uses punctuation as tokens?


I just tried tokenizer_spacy instead of the whitespace tokenizer I usually use, and noticed that punctuation marks (., ?, !) are emitted as separate tokens. That is bad for my case. Is this normal? Why does it happen? Can I switch it off?

spaCy adds them as tokens; there is nothing we can do about that on our side. You could try asking on the spaCy forum.
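To illustrate: this is spaCy's default tokenizer behavior, not something the tokenizer component configures. A minimal sketch (using a blank English pipeline so no model download is needed) shows the punctuation tokens, and one common workaround if you post-process tokens yourself: filtering them out with spaCy's `Token.is_punct` flag.

```python
import spacy

# blank pipeline: rule-based tokenizer only, no trained model required
nlp = spacy.blank("en")

doc = nlp("Is this normal?")

# spaCy splits trailing punctuation into its own token
tokens = [t.text for t in doc]
print(tokens)  # ['Is', 'this', 'normal', '?']

# workaround: drop punctuation tokens downstream via the is_punct flag
words = [t.text for t in doc if not t.is_punct]
print(words)  # ['Is', 'this', 'normal']
```

Note this only helps if you control the token stream after tokenization; it does not change what the tokenizer component itself produces.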