Any cookbook/recipe close to this inventory bot I have in mind?

Hello, my stuff, digital and physical, is a mess. I somehow find a way to mess things up again every time I clean up. This time I want to build something that forces me to document out loud what an item is and where it is. It should also be scalable, so more people can collaborate on the inventory.

  • :jigsaw: Place: Wake keyword, Place 4 blue ink pens in locationX.
  • :paperclip: Note: Wake keyword, Note yellow earplugs are used for noise reduction.
  • :gift: Give: Wake keyword, Give 19.6v charger to Perto.
  • :mag: Find: Wake keyword, Find the 19.6v charger?
  • :card_index_dividers: Relate: Wake keyword, schematics book is part of the fix-the-laptop project.

Then the server Rasa runs on would use txtai for semantic search to build a pipeline for my digital stuff: bookmarks, PDFs, etc. This builds a database of all my digital and physical stuff and finds relations between them. Is there any example project I could follow to build this? Or can you point me to the NLU documentation, specifically for extracting “4 blue ink pens” and adding it to a database, for example? I plan on learning by documenting my progress, and of course this will be open source. Any help appreciated, thanks!


Welcome to the community, Naim :slight_smile:

This is a very interesting use case! But your problem is that entities (e.g. “19.6V charger”) are not defined in a list and don’t follow a specific pattern. They could literally be anything.

So a more suitable bot for you would be a rule-based bot, not a machine learning bot like Rasa. Of course, you could do that with Rasa, using payloads, but this completely removes the AI part.


If you want to use Rasa, here is what you can do:

1. Transform messages to payloads

What you could do is transform your messages to payloads from your front-end application.

For example: Find the 19.6V charger becomes /find{"object": "19.6V charger"}.

I think this would be pretty easy to do. You could make a dropdown list with the keywords, and an input textbox for the object name.
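Here is a minimal sketch of that transformation in Python (build_payload and the keyword/object wiring are made up; only the /intent{...} payload format comes from Rasa):

import json

# Hypothetical frontend-side helper: turns a selected keyword and a typed
# object name into the payload string Rasa expects.
def build_payload(intent: str, obj: str) -> str:
    # json.dumps takes care of quoting/escaping the object name
    return f"/{intent}{json.dumps({'object': obj})}"

# build_payload("find", "19.6V charger")
# -> '/find{"object": "19.6V charger"}'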

2. Send it to Rasa

Now, send this message from your frontend to Rasa using the Rasa REST API.

// POST http://<host>:<port>/webhooks/rest/webhook
{
  "sender": "Naim",
  "message": "/find{\"object\": \"19.6V charger\"}"
}
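From a Python client, that could look like this (a sketch; the host, port, and sender id are placeholders):

import requests

# Placeholder URL - replace with your Rasa server's host and port
RASA_URL = "http://localhost:5005/webhooks/rest/webhook"

payload = {
    "sender": "Naim",
    "message": '/find{"object": "19.6V charger"}',
}

# The REST channel returns a list of bot responses as JSON
response = requests.post(RASA_URL, json=payload)
for message in response.json():
    print(message.get("text"))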

3. Define rules for the intents

You should do this step while coding the bot, so technically it comes before the other two, but I’m going through the process in the order the bot handles a message.

Define rules that link every intent to their specific custom action (more on that later).

Of course, you could send all intents to the same action and handle the logic there, since that would be easy to do. Usually you should not do that, but it’s kinda okay in your case. I still wouldn’t recommend it.

So build rules like so:

rules:

- rule: Place
  steps:
  - intent: place
  - action: action_place

- rule: Note
  steps:
  - intent: note
  - action: action_note

etc.

4. Query the database

The actions (action_place, action_note, etc.) are custom actions, which are regular Python code.

In a custom action, you can access the tracker to check info about the conversation and latest message, such as the entities.

Once you are in a custom action (e.g. action_find), make sure you have the correct entities present:

from typing import Any, Dict, List, Text

from rasa_sdk import Action, Tracker
from rasa_sdk.executor import CollectingDispatcher


class Find(Action):
    def name(self) -> Text:
        return "action_find"

    async def run(self, dispatcher: CollectingDispatcher, tracker: Tracker,
                  domain: Dict[Text, Any]) -> List[Dict[Text, Any]]:
        # Grab the first "object" entity from the latest message, if any
        obj = next(tracker.get_latest_entity_values("object"), None)
        if obj is None or obj.strip() == '':
            # Handle the case where the "object" entity is not present
            dispatcher.utter_message('Please say what you want to find')
        else:
            # Write your own code to find the object in your database, or however txtai works
            response = find(obj)
            dispatcher.utter_message(f'The location of "{obj}" is: {response}')

        return []
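As for the find() call itself, here is one rough possibility using txtai for semantic search (the ITEMS mapping, model name, and index contents are made up for illustration; the only real txtai calls are index() and search()):

from txtai.embeddings import Embeddings

# Made-up in-memory "database": item id -> (description, location)
ITEMS = {
    "charger-19.6v": ("19.6v laptop charger", "drawer B3"),
    "blue-ink-pens": ("4 blue ink pens", "locationX"),
}

# Build a semantic index over the item descriptions
embeddings = Embeddings({"path": "sentence-transformers/nli-mpnet-base-v2"})
embeddings.index([(uid, desc, None) for uid, (desc, _) in ITEMS.items()])


def find(query: str) -> str:
    # search() returns a list of (id, score) tuples, best match first
    results = embeddings.search(query, 1)
    if not results:
        return "unknown"
    uid, _score = results[0]
    return ITEMS[uid][1]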

Hi Chris, nice to see someone else from Beirut here. :smiling_face_with_three_hearts: There’s a pattern, of course. For every command, I’ll restrict my language to the corresponding pattern. Take a closer look at the attached diagram, but I’ll type it out so it’s clearer :nerd_face:

  • Wake keyword, Place {number} of {description} {item} at {location}
  • Wake keyword, Bundle {{number} of {item}} and {{number} of {item}} at {location}
  • Wake keyword, Note {item} is used for {description}
  • Wake keyword, Give {item} to {user} {note}
  • Wake keyword, Find {description} {item}
  • Wake keyword, Relate {description} {item} to {topic}

I think if I just get some help with the Place command, that should get me started on figuring out the rest on my own.

I was originally inspired by this instructable (demo’ed here), but I don’t want to restrict it to a cabinet on one wall. I don’t know how yet, but I want to come up with some addressing scheme or coordinate system that covers the whole apartment.

Also, the FindyBot 3000 project uses Google Assistant + IFTTT + an Azure Function. Other than the privacy concerns, that would only work for the currently supported standardized languages, and there’s not much to learn going down that route. Besides organizing and taking a detailed inventory of my stuff, this project would somehow seed question/answer content for my language acquisition project (a much bigger thing I’ve been working on). I’m not sure how yet, as I’m just getting into NLP and chatbots. Enter Rasa :smiling_face_with_three_hearts:


You’re right though, in step 26 of the instructable, he does create some sort of seed item database.

There are two relevant tables, Items and Tags. Items stores the item name, quantity, row, column, and other such info for the items themselves, including a unique identifier, NameKey. Tags stores the associated tags for a given NameKey, with the requirement that all tags must be singularized lower-case words.

The NameKey column must be unique within the Items table and must exist in the Items table before adding entries to the Tags table. NameKey should be lowercase and singular. For example, to insert 24 Orange LEDs, the NameKey would be ‘orange led’. Relevant tags for ‘orange led’ might include: orange, led, light, diode, etc., all lower case and singular. Each tag takes a new row in the Tags table, all with the same NameKey of ‘orange led’.
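In SQLite terms, that would look roughly like this (a sketch; columns beyond the ones mentioned above are my guesses):

import sqlite3

conn = sqlite3.connect("inventory.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS Items (
    NameKey   TEXT PRIMARY KEY,   -- lowercase, singular, e.g. 'orange led'
    Name      TEXT,               -- display name, e.g. 'Orange LEDs'
    Quantity  INTEGER,
    Row       INTEGER,
    Col       INTEGER
);
CREATE TABLE IF NOT EXISTS Tags (
    NameKey   TEXT REFERENCES Items(NameKey),
    Tag       TEXT                -- singular lowercase word, e.g. 'orange', 'led'
);
""")
conn.commit()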

I don’t know what the equivalent would be for Rasa.

Oh hi! Nice to have you here :smiley:

Oh no, I meant the pattern for the entity values, not the whole sentence - that’s where the problem is.

number and location have a pattern or a finite definition, but things like description, note, and item could be anything!

Usually, you define entities with regexes if there is a pattern, or with lookup tables if there’s a finite number of possible values (even if it’s a million).

Now, you don’t need to use either of those - I was successful in training a bot to recognize locations without using a lookup table, but I still had to give it a lot of data (see here), and it did not work for names that were different from the training data.

Anyway, if the pattern is that structured, I would do the parsing in your frontend application or with another type of AI/bot, then send a payload to Rasa to execute an action.
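For example, a small parser for the Place pattern on the frontend could be sketched like this (the regex, group names, and payload fields are assumptions based on the pattern posted above):

import json
import re

# Matches e.g. "Place 4 blue ink pens in locationX"
# (a rough sketch of the "Place {number} of {description} {item} at {location}" pattern)
PLACE_PATTERN = re.compile(
    r"^place\s+(?P<number>\d+)\s+(?P<item>.+?)\s+(?:in|at)\s+(?P<location>.+)$",
    re.IGNORECASE,
)


def to_place_payload(sentence: str):
    match = PLACE_PATTERN.match(sentence.strip())
    if match is None:
        return None
    # Build the /intent{...} payload that Rasa's REST channel understands
    return "/place" + json.dumps({
        "number": int(match.group("number")),
        "item": match.group("item"),
        "location": match.group("location"),
    })

# to_place_payload("Place 4 blue ink pens in locationX")
# -> '/place{"number": 4, "item": "blue ink pens", "location": "locationX"}'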

Rasa is made for more unstructured data: it’s a conversational chatbot platform, which helps in regular conversations by detecting the intent behind a message, responding appropriately, and keeping track of the conversation’s context.

Of course, you can still use it for your project! But again, I recommend integrating it with other platforms that will make your specific task easier to implement :slight_smile:

There are two main types of built-in storage for Rasa:

  • Entities, which are short-term memory and get erased after each message
  • Slots, which are long-term memory and get filled either automatically from entities, from slot mappings, or from custom actions

But both of these are meant to hold relatively small pieces of information; they are not to be used as databases. What I would suggest is using custom actions to query a third-party database with your items in it. It could be MySQL, MongoDB, a JSON file, Firebase, or anything really.
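For example, an action_place that writes to a local SQLite file could be sketched like this (the table layout, file name, and entity names mirror the Place payload above and are assumptions, not a finished design):

import sqlite3
from typing import Any, Dict, List, Text

from rasa_sdk import Action, Tracker
from rasa_sdk.executor import CollectingDispatcher


class Place(Action):
    def name(self) -> Text:
        return "action_place"

    async def run(self, dispatcher: CollectingDispatcher, tracker: Tracker,
                  domain: Dict[Text, Any]) -> List[Dict[Text, Any]]:
        # Entity names assumed to match the payload: number, item, location
        number = next(tracker.get_latest_entity_values("number"), None)
        item = next(tracker.get_latest_entity_values("item"), None)
        location = next(tracker.get_latest_entity_values("location"), None)

        if not item or not location:
            dispatcher.utter_message("Please tell me what to place and where.")
            return []

        # "inventory.db" is a placeholder path
        conn = sqlite3.connect("inventory.db")
        conn.execute(
            "CREATE TABLE IF NOT EXISTS items (item TEXT, quantity INTEGER, location TEXT)"
        )
        conn.execute(
            "INSERT INTO items (item, quantity, location) VALUES (?, ?, ?)",
            (item, int(number) if number else 1, location),
        )
        conn.commit()
        conn.close()

        dispatcher.utter_message(f'Placed {number or 1} "{item}" at {location}.')
        return []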