spaCy language model issue: missing en_core_web_sm/tokenizer

I am trying to train the NLU model for the starter pack. My company has restrictions that prevent me from installing the “en” language model through spaCy, because the installer pulls from an external source. So I manually downloaded the language model and installed it, then added a spacy.load call for it in the init file. But when I run train-nlu, I get a “tokenizer not found” error, and in fact I don’t see a tokenizer in the model directory where it’s looking.
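For reference, the manual install went roughly like this (the archive filename and version are illustrative, not necessarily the exact ones I used):

```shell
# Install the manually downloaded model archive with pip
# (filename/version illustrative)
pip install ./en_core_web_sm-2.0.0.tar.gz

# Create the "en" shortcut link so spacy.load("en") resolves
# to the installed package
python -m spacy link en_core_web_sm en
```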

Any ideas?

Can you link to where you manually downloaded it?