drop_rate vs weight_sparsity

I was going through this awesome video: Rasa Algorithm Whiteboard - Diet Architecture 3: Benchmarking - YouTube

But I have a doubt: what is the difference between drop_rate and weight_sparsity? It seems both drop some percentage of neurons in a neural network, so I'm not sure why there are two different keys for that.

Is each one dropping a different set of the network's neurons? @akelad, @rctatman, @dakshvar22, could you help?

@Ghostvv, @akelad, could you help?

@omkarcpatil drop_rate randomly sets a subset of weights to zero, and this random subset can change across the different batches that are fed to the model. weight_sparsity, on the other hand, sets a pre-defined subset of weights to 0 at the beginning of training, and that subset doesn't change across batches. Weights set to 0 by weight sparsity also remain 0 at inference time, since they are never tuned during training.
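To make the distinction concrete, here is a minimal NumPy sketch (hypothetical names, not Rasa's actual implementation): the weight_sparsity mask is drawn once before training and applied always, while the drop_rate mask is redrawn for every training batch and disabled at inference.

```python
import numpy as np

rng = np.random.default_rng(0)

drop_rate = 0.5        # fraction of weights dropped, resampled each batch
weight_sparsity = 0.5  # fraction of weights fixed at 0 for the whole run

weights = rng.normal(size=(4, 4))

# weight_sparsity: one mask chosen before training, reused forever
sparsity_mask = rng.random(weights.shape) >= weight_sparsity

def forward(x, training):
    w = weights * sparsity_mask  # sparse weights stay 0 in every mode
    if training:
        # drop_rate: a fresh random mask for each batch (dropout-style),
        # with inverted-dropout scaling to keep expected magnitudes stable
        drop_mask = rng.random(w.shape) >= drop_rate
        w = w * drop_mask / (1.0 - drop_rate)
    return x @ w

x = np.ones((1, 4))
out_batch_1 = forward(x, training=True)   # varies batch to batch
out_batch_2 = forward(x, training=True)
out_infer = forward(x, training=False)    # deterministic; sparsity still applied
```

Note how, at inference, only the fixed sparsity mask survives: the dropout mask is gone, but the weights zeroed by weight_sparsity stay zero.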

Thank you very much @dakshvar22! One last question: could you tell me where I can find all the parameters for DIETClassifier (or any classifier)?