TypeError: can't pickle _thread.RLock objects

I am using my own model architecture in a custom Keras policy. The network gets trained, but the model does not get saved; it throws the following error:

```
Traceback (most recent call last):
  File "C:\Apps\sa2446\Scripts\rasa-script.py", line 11, in <module>
    load_entry_point('rasa', 'console_scripts', 'rasa')()
  File "C:\Users\SA2446\AppData\Roaming\Python\Python37\site-packages\rasa\__main__.py", line 76, in main
    cmdline_arguments.func(cmdline_arguments)
  File "C:\Users\SA2446\AppData\Roaming\Python\Python37\site-packages\rasa\cli\train.py", line 77, in train
    kwargs=extract_additional_arguments(args),
  File "C:\Users\SA2446\AppData\Roaming\Python\Python37\site-packages\rasa\train.py", line 40, in train
    kwargs=kwargs,
  File "c:\apps\sa2446\lib\asyncio\base_events.py", line 579, in run_until_complete
    return future.result()
  File "C:\Users\SA2446\AppData\Roaming\Python\Python37\site-packages\rasa\train.py", line 87, in train_async
    kwargs,
  File "C:\Users\SA2446\AppData\Roaming\Python\Python37\site-packages\rasa\train.py", line 169, in _train_async_internal
    kwargs=kwargs,
  File "C:\Users\SA2446\AppData\Roaming\Python\Python37\site-packages\rasa\train.py", line 203, in _do_training
    kwargs=kwargs,
  File "C:\Users\SA2446\AppData\Roaming\Python\Python37\site-packages\rasa\train.py", line 331, in _train_core_with_validated_data
    kwargs=kwargs,
  File "C:\Users\SA2446\AppData\Roaming\Python\Python37\site-packages\rasa\core\train.py", line 67, in train
    agent.persist(output_path, dump_stories)
  File "C:\Users\SA2446\AppData\Roaming\Python\Python37\site-packages\rasa\core\agent.py", line 793, in persist
    self.policy_ensemble.persist(model_path, dump_flattened_stories)
  File "C:\Users\SA2446\AppData\Roaming\Python\Python37\site-packages\rasa\core\policies\ensemble.py", line 180, in persist
    policy.persist(policy_path)
  File "C:\ICA\restaurantbot\custom_policy.py", line 406, in persist
    self.model.save(model_file, overwrite=True)
  File "c:\apps\sa2446\lib\site-packages\keras\engine\network.py", line 1090, in save
    save_model(self, filepath, overwrite, include_optimizer)
  File "c:\apps\sa2446\lib\site-packages\keras\engine\saving.py", line 382, in save_model
    _serialize_model(model, f, include_optimizer)
  File "c:\apps\sa2446\lib\site-packages\keras\engine\saving.py", line 83, in _serialize_model
    model_config['config'] = model.get_config()
  File "c:\apps\sa2446\lib\site-packages\keras\engine\network.py", line 931, in get_config
    return copy.deepcopy(config)
  File "c:\apps\sa2446\lib\copy.py", line 150, in deepcopy
    y = copier(x, memo)
  File "c:\apps\sa2446\lib\copy.py", line 240, in _deepcopy_dict
    y[deepcopy(key, memo)] = deepcopy(value, memo)
  File "c:\apps\sa2446\lib\copy.py", line 150, in deepcopy
    y = copier(x, memo)
  File "c:\apps\sa2446\lib\copy.py", line 215, in _deepcopy_list
    append(deepcopy(a, memo))
  File "c:\apps\sa2446\lib\copy.py", line 150, in deepcopy
    y = copier(x, memo)
  File "c:\apps\sa2446\lib\copy.py", line 240, in _deepcopy_dict
    y[deepcopy(key, memo)] = deepcopy(value, memo)
  File "c:\apps\sa2446\lib\copy.py", line 150, in deepcopy
    y = copier(x, memo)
  File "c:\apps\sa2446\lib\copy.py", line 240, in _deepcopy_dict
    y[deepcopy(key, memo)] = deepcopy(value, memo)
  File "c:\apps\sa2446\lib\copy.py", line 150, in deepcopy
    y = copier(x, memo)
  File "c:\apps\sa2446\lib\copy.py", line 220, in _deepcopy_tuple
    y = [deepcopy(a, memo) for a in x]
  File "c:\apps\sa2446\lib\copy.py", line 220, in <listcomp>
    y = [deepcopy(a, memo) for a in x]
  File "c:\apps\sa2446\lib\copy.py", line 150, in deepcopy
    y = copier(x, memo)
  File "c:\apps\sa2446\lib\copy.py", line 220, in _deepcopy_tuple
    y = [deepcopy(a, memo) for a in x]
  File "c:\apps\sa2446\lib\copy.py", line 220, in <listcomp>
    y = [deepcopy(a, memo) for a in x]
  File "c:\apps\sa2446\lib\copy.py", line 150, in deepcopy
    y = copier(x, memo)
  File "c:\apps\sa2446\lib\copy.py", line 215, in _deepcopy_list
    append(deepcopy(a, memo))
  File "c:\apps\sa2446\lib\copy.py", line 180, in deepcopy
    y = _reconstruct(x, memo, *rv)
  File "c:\apps\sa2446\lib\copy.py", line 280, in _reconstruct
    state = deepcopy(state, memo)
  File "c:\apps\sa2446\lib\copy.py", line 150, in deepcopy
    y = copier(x, memo)
  File "c:\apps\sa2446\lib\copy.py", line 240, in _deepcopy_dict
    y[deepcopy(key, memo)] = deepcopy(value, memo)
  File "c:\apps\sa2446\lib\copy.py", line 180, in deepcopy
    y = _reconstruct(x, memo, *rv)
  File "c:\apps\sa2446\lib\copy.py", line 280, in _reconstruct
    state = deepcopy(state, memo)
  File "c:\apps\sa2446\lib\copy.py", line 150, in deepcopy
    y = copier(x, memo)
  File "c:\apps\sa2446\lib\copy.py", line 240, in _deepcopy_dict
    y[deepcopy(key, memo)] = deepcopy(value, memo)
  File "c:\apps\sa2446\lib\copy.py", line 180, in deepcopy
    y = _reconstruct(x, memo, *rv)
  File "c:\apps\sa2446\lib\copy.py", line 280, in _reconstruct
    state = deepcopy(state, memo)
  File "c:\apps\sa2446\lib\copy.py", line 150, in deepcopy
    y = copier(x, memo)
  File "c:\apps\sa2446\lib\copy.py", line 240, in _deepcopy_dict
    y[deepcopy(key, memo)] = deepcopy(value, memo)
  File "c:\apps\sa2446\lib\copy.py", line 169, in deepcopy
    rv = reductor(4)
TypeError: can't pickle _thread.RLock objects
```

```
2019-10-14 13:41:03 INFO     custom_policy  - Fitting model with 21634 total samples and a validation split of 0.2
Epoch 1/15
7360/7360 [==============================] - 1036s 141ms/step - loss: 3.3445 - acc: 0.8401
Epoch 2/15
7360/7360 [==============================] - 914s 124ms/step - loss: 2.0826 - acc: 0.8624
Epoch 3/15
7360/7360 [==============================] - 903s 123ms/step - loss: 1.2239 - acc: 0.8624
Epoch 4/15
7360/7360 [==============================] - 1030s 140ms/step - loss: 0.8736 - acc: 0.8627
Epoch 5/15
7360/7360 [==============================] - 1098s 149ms/step - loss: 0.7806 - acc: 0.8628
Epoch 6/15
7360/7360 [==============================] - 1046s 142ms/step - loss: 0.7417 - acc: 0.8628
Epoch 7/15
7360/7360 [==============================] - 1024s 139ms/step - loss: 0.7115 - acc: 0.8640
Epoch 8/15
7360/7360 [==============================] - 1016s 138ms/step - loss: 0.6844 - acc: 0.8658
Epoch 9/15
7360/7360 [==============================] - 964s 131ms/step - loss: 0.6591 - acc: 0.8682
Epoch 10/15
7360/7360 [==============================] - 847s 115ms/step - loss: 0.6353 - acc: 0.8717
Epoch 11/15
7360/7360 [==============================] - 847s 115ms/step - loss: 0.6129 - acc: 0.8727
Epoch 12/15
7360/7360 [==============================] - 849s 115ms/step - loss: 0.5919 - acc: 0.8727
Epoch 13/15
7360/7360 [==============================] - 847s 115ms/step - loss: 0.5722 - acc: 0.8731
Epoch 14/15
7360/7360 [==============================] - 847s 115ms/step - loss: 0.5538 - acc: 0.8774
Epoch 15/15
7360/7360 [==============================] - 846s 115ms/step - loss: 0.5367 - acc: 0.8797
2019-10-14 17:36:15 INFO     custom_policy  - Done fitting keras policy model
Processed trackers: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 1303/1303 [00:10<00:00, 120.78it/s, # actions=13820]
Processed actions: 13820it [00:02, 6663.99it/s, # examples=12334]
```

Can I get some information on how to solve this issue?

This sounds like a Keras-specific issue rather than anything to do with Rasa.

Thanks for the information! It's obviously a Keras-specific issue. I solved it by using a closure to wrap my Lambda function, as described here: (tensorflow - Checkpointing keras model: TypeError: can't pickle _thread.lock objects - Stack Overflow)
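
For anyone who hits the same error later: the traceback shows `model.save()` failing inside `copy.deepcopy(model.get_config())`, which typically happens when a non-picklable TensorFlow object (its internals hold a thread lock) ends up in a `Lambda` layer's config, e.g. through `arguments=` or a default argument. Below is a minimal sketch of the closure workaround, not the actual `custom_policy` code; `make_scaler`, the layer sizes, and the captured scale value are made-up illustrations, and the closure should capture only plain Python values so nothing non-picklable reaches the config.

```python
from keras.layers import Input, Lambda, Dense
from keras.models import Model

# Problematic patterns: the captured object becomes part of the Lambda layer's
# config, and model.save() then fails while deep-copying that config.
#   Lambda(lambda x, t=some_tensor: x * t)           # object in the function defaults
#   Lambda(scale_fn, arguments={"t": some_tensor})   # object in the `arguments` dict

def make_scaler(scale):
    """Closure factory: `scale` is a plain float, so the layer config stays picklable."""
    def scaler(x):
        return x * scale
    return scaler

inputs = Input(shape=(10,))
scaled = Lambda(make_scaler(0.5))(inputs)   # closure keeps `scale` out of `arguments=`
outputs = Dense(1)(scaled)

model = Model(inputs, outputs)
model.compile(optimizer="adam", loss="mse")
model.save("model.h5", overwrite=True)      # serializes without the RLock error
```

If the wrapped function genuinely needs another tensor at call time, feeding it in as an extra `Input` to the model (rather than capturing it) is another way to keep it out of the layer config.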
