Generating MQL Shell Commands Using OpenAI and New mongosh Shell
MongoDB’s Query Language (MQL) is an intuitive language for developers to interact with MongoDB documents. For this reason, I wanted to put OpenAI to the test of quickly learning the MongoDB query language and using its general knowledge to build queries from simple sentences. The results were more than satisfying to me. GitHub is already working on a project that uses the same OpenAI engine to generate code.
If you want to use OpenAI, you will first need to get a trial API key by joining the OpenAI API beta. Once you are approved for an API key, you will be granted about $18 of credit for three months of testing. Each call to OpenAI is billed, which is something to consider when using it in production. For our purposes, $18 is more than enough to test the most expensive engine, named “davinci.”
I’ve decided to use a Questions and Answers pattern, in the form of Q: <Question> followed by A: <Answer>, provided to the OpenAI completion API as the learning material about MongoDB queries for the AI engine. To feed it properly, I placed the training questions and answers in a file called AI-input.txt.
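The content of AI-input.txt is not reproduced in this extract; as an illustration, such a training file might contain pairs like the following (the collection and field names are assumptions, not the article's actual examples):

```
Q: Find all documents in the movies collection
A: db.movies.find({})
Q: Find all movies released in the year 2000
A: db.movies.find({ year: 2000 })
Q: Count how many movies are in the movies collection
A: db.movies.countDocuments({})
```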
We will use this file later in our code, so that the completion will follow the same pattern.
Use the copied connection string, providing it to the mongosh binary to connect to the pre-populated Atlas cluster with sample data. Then, switch to the database that holds the sample data.
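The exact connection string and database are not shown in this extract; a minimal sketch (the cluster address, credentials, and sample database name are placeholders and assumptions) might look like:

```
mongosh "mongodb+srv://<user>:<password>@<cluster>.mongodb.net/"
use sample_mflix
```

Here sample_mflix is one of the Atlas sample datasets; any pre-populated sample database will do.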
Now, we can build our textToMql function by pasting it into mongosh. The function will receive a text sentence, use our generated OpenAI API key, and try to return the best MQL command for it:
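The function body is not reproduced in this extract. A minimal sketch, assuming the community "openai-api" npm package (its complete() method and response shape are assumptions based on that package, not confirmed by the article), might look like:

```javascript
// Build the prompt: training Q&A pairs, then our question, then "A:",
// which tells the engine we expect an answer next.
function buildPrompt(learningPath, query) {
  return `${learningPath}\nQ: ${query}\nA:`;
}

// Hypothetical sketch of textToMql for mongosh, using the community
// "openai-api" npm package (npm install openai-api).
async function textToMql(query) {
  const OpenAI = require('openai-api');
  const fs = require('fs');
  const openai = new OpenAI(process.env.OPENAI_API_KEY);

  // Read the Q&A training material prepared earlier.
  const learningPath = fs.readFileSync('./AI-input.txt', 'utf8');

  const response = await openai.complete({
    engine: 'davinci',   // the most capable (and most expensive) engine
    prompt: buildPrompt(learningPath, query),
    maxTokens: 100,      // cap on the size of the returned completion
    temperature: 0.3,    // fairly deterministic output
    stop: ['\n']         // MQL statements are one line; stop at newline
  });

  // The completion text holds the suggested MQL command.
  console.log(response.data.choices[0].text);
}
```

In mongosh, you would then call textToMql("...") with a plain-English request.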
In the above function, we first load the OpenAI npm module and initialize a client with the relevant API key from OpenAI. Then, we read the learning data from our AI-input.txt file. Finally, we append our Q: <query> input at the end, followed by the A: value, which tells the engine we expect an answer based on the provided learningPath and our query.
This data is then sent to an OpenAI API call. The call invokes the completion API, passing the entire assembled text as the prompt along with some additional parameters, which I will elaborate on:
engine: OpenAI supports a few AI engines which differ in quality and purpose as a tradeoff for pricing. The “davinci” engine is the most sophisticated one, according to OpenAI, and is therefore the most expensive one in terms of billing consumption.
temperature: How creative the AI will be relative to the input we gave it. It ranges between 0 and 1; 0.3 felt like a down-to-earth value, but you can play with it.
max_tokens: Caps the amount of data that will be returned.
stop: A list of strings that stop the engine from producing further content. Since we need to produce single-line MQL statements, “\n” is used as the stop character.
Once the content is returned, we parse the returned JSON and print the resulting MQL command.
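Assuming the response shape of the community "openai-api" package (an assumption, with invented values for illustration), extracting the command looks like this:

```javascript
// Hypothetical response shape from the completion call; the values
// here are invented for illustration.
const response = {
  data: {
    choices: [{ text: ' db.movies.find({ year: 1995 })' }]
  }
};

// Parse out the completion text and trim whitespace before printing.
const mql = response.data.choices[0].text.trim();
console.log(mql);   // prints the suggested MQL command
```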
Once we have our function in place, we can try to produce a simple query to test it:
Let's do something more creative.
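For illustration, here is the kind of aggregation pipeline the engine might emit for a prompt such as "Group movies by year and count them" (a hypothetical output; the collection and field names are assumptions, not from the article):

```javascript
// Hypothetical AI-generated aggregation pipeline: group movies by year,
// count each group, and sort by the largest groups first.
const pipeline = [
  { $group: { _id: '$year', total: { $sum: 1 } } },
  { $sort: { total: -1 } }
];

// In mongosh this would run as: db.movies.aggregate(pipeline)
```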
Now that is the AI power of MongoDB pipelines!
MongoDB's new shell allows us to script with enormous power like never before by utilizing external npm packages. Together with OpenAI's sophisticated AI models, we were able to teach the shell how to turn text prompts into accurate, complex MongoDB commands, and with further learning and tuning, we can probably get much better results.
Try this today using the new MongoDB shell.