The Support Group Blog

FileMaker 2025: AI Integration

Written by The Support Group | Jul 16, 2025 3:46:46 PM

Claris has gone ALL IN with AI integration on the FileMaker platform. There are a lot of new features to explore – so many that we could probably do a blog post a week for the next year going into all of them!  For today, we’re going to focus on the most basic of the AI tools in the FileMaker Pro app itself.   Some things to think about before we get started:

First off, much of this is new and changing rapidly. We are still learning how all of this works; we are FileMaker experts, after all, not AI experts. As with every new tool, if you decide to use it, know that there is a learning curve, and it's worth educating yourself on some of the AI terminology and how these models work.

Second, as a rule of thumb, make sure you’re cognizant of what you’re sending to and from the external LLMs. You can certainly create the ability, within your application, to communicate with an external LLM without allowing access to your FileMaker database or files.  But as you get more sophisticated with the use of these various tools, be aware of the security risks involved and be intentional about what you share.  If you get to a point where you are concerned about security, Claris has given us the ability to install and run an LLM locally.  And indeed, all of the script steps work exactly the same for a custom LLM as they do for an external model. You simply specify a different endpoint in the first configuration script step. 
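To make the "same script steps, different endpoint" point concrete, here's a rough sketch in Python (FileMaker script steps aren't code, so this is purely illustrative; the function name, field names, and the local URL are assumptions, not FileMaker or Claris APIs):

```python
def make_ai_config(account_name: str, api_key: str,
                   endpoint: str = "https://api.openai.com/v1") -> dict:
    """Build a provider configuration; swap only the endpoint for a local LLM."""
    return {
        "account_name": account_name,
        "api_key": api_key,
        "endpoint": endpoint,
    }

# External provider: the default endpoint points at OpenAI.
external = make_ai_config("my_openai", "sk-...")

# Locally hosted model: identical configuration, different endpoint.
local = make_ai_config("my_local_llm", "not-needed",
                       endpoint="http://localhost:8080/v1")

# Everything except the endpoint value is structured identically.
assert external.keys() == local.keys()
```

The takeaway: once the configuration step points at the right endpoint, everything downstream stays the same.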

With that out of the way, let’s dive in.  The two new script steps we are going to review are:

  1. Configure AI Account
  2. Generate Response From Model

Using just these two script steps, we’re going to build a basic chatbot within FileMaker that will allow the user to submit queries to ChatGPT and view the results.  Note that none of the FileMaker data from the application will be sent to the LLM; this is a very basic example to get you started.

 

Getting an API Key

Before we get into FileMaker, you’re going to need to get an API key. For our example, we’re going to use the most commonly used option: OpenAI. The process is quite simple once you’ve created your account.  You’ll be prompted to create a name for the key; the name is just a label to help you keep track of what you’re using the key for.

Next, you’ll be given the opportunity to copy the API key, which will look like a long string of random numbers and letters.  It’s crucial to copy the key down, as you won’t be able to access it again once you close the dialog box where it’s generated.  You can always generate a new one if needed, but copying it right away is a good habit to get into.  Here are some screenshots of this process:

This step should be the only one you have to take outside of FileMaker.  Next you'll set up the initial script step.

 

Configure AI Account

This script step is very simple, and only has three fields to fill in:  Account Name, Model Provider, and API Key.  

 
Account Name

This can be any text string that you choose.  It does not have to be the same as the API key name, but it’s important to pick something easy to remember because you’ll be referencing it in other script steps later in the process.

 
Model Provider

This is where you select which LLM you’re using.  You have four options: OpenAI, Anthropic, Cohere, and Custom.

 

OpenAI, Anthropic, and Cohere are all external AI services.   The custom option is there for you to provide an endpoint to either a different external AI service or a locally hosted LLM.  We’re going to select OpenAI, which is the default option.

 

API Key

This is where you paste in the API key you copied from your provider.  It should look something like this when you’re done. 

 

Generate Response From Model

This is where we get into the real meat.  We’ve already set up our connection with the last script step.  Now we can submit a prompt and get a response. A blank version of this script step looks like this:

 

Account Name

This will be the Account Name you specified in the previous script step: Configure AI Account.  

 

Model

This can be a bit tricky because just about every commercially available LLM provider offers a variety of models.  You can find the list of available models and their pricing on your provider’s website. Two things to consider:

  1. You have to have access to these models through your account, and most providers require a payment method to be entered.  Pricing structures differ, but in general, they’ll be charging you per so many tokens.
  2. You have to type in the name of the model exactly as it is listed, case sensitive and all.  We are going to enter "gpt-4o". 
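Since per-token billing can be unintuitive, here's a back-of-the-envelope cost sketch. The rates below are hypothetical placeholders (real prices vary by model and change often; check your provider's pricing page):

```python
def estimate_cost(input_tokens: int, output_tokens: int,
                  price_in_per_m: float, price_out_per_m: float) -> float:
    """Rough USD cost estimate given per-million-token rates."""
    return (input_tokens / 1_000_000) * price_in_per_m \
         + (output_tokens / 1_000_000) * price_out_per_m

# Hypothetical rates: $2.50 per million input tokens, $10.00 per million output.
cost = estimate_cost(1_000, 500, 2.50, 10.00)  # roughly $0.0075
```

Input (prompt) and output (response) tokens are usually billed at different rates, which is why the two are separated here.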

User Prompt

This is the prompt that you’ll be submitting to the LLM.  We are going to use a field that the user will be able to type into. 

Next, you’ll want to click on the gear icon to the right of the script step to fill in a few additional options.  

While there are a lot of parameters, we’re going to keep this example as simple as possible.  You’ll need to specify three things:  Response, Save Message History To, and Messages.  

*Note #1:  Agentic Mode is on by default, but not required for this particular scenario.  You’ll want to turn it off.  

*Note #2: For fun, you might want to toggle “Stream” on.  It’s not required, but it will “stream” the results as they come in from the model, instead of displaying them only after the response is complete.  This mimics the behavior of most AI chatbots, so users might be expecting it to work that way.
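The difference between streaming and non-streaming display can be sketched like this (the generator below is a stand-in for a model response, not a real API; everything here is illustrative):

```python
from typing import Iterator

def fake_model_stream(prompt: str) -> Iterator[str]:
    """Stand-in for a streaming model response: chunks arrive one at a time."""
    for chunk in ["File", "Maker ", "2025 ", "supports ", "AI."]:
        yield chunk

# Stream off: wait for the full text, then display it once.
full_text = "".join(fake_model_stream("..."))

# Stream on: display each chunk as it arrives.
displayed = []
for chunk in fake_model_stream("..."):
    displayed.append(chunk)  # in FileMaker, the target field would update live

assert "".join(displayed) == full_text
```

Either way the final text is the same; streaming just changes when the user starts seeing it.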

 

Response

Here you’ll want to specify a field or a variable to display or process the response from the model.  Again, we’re using a field that we can display to the user.   

 

Save Message History To

Now, we could stop here and the request would work.  But each request would be submitted to the LLM without any history of previous prompts. To make the exchange more conversational, with back-and-forth follow-up questions that carry the context of previous interactions, we’re going to save the message history to a global variable.  We’re going to set the COUNT to 10, which is simply how many messages you want to save.
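The COUNT behavior amounts to a rolling window over the message list. A minimal sketch, assuming the history is a simple list of role/content messages (the function name is ours, not FileMaker's):

```python
def append_to_history(history: list[dict], message: dict,
                      count: int = 10) -> list[dict]:
    """Append a message and keep only the most recent `count` entries,
    mirroring the COUNT setting on Save Message History To."""
    return (history + [message])[-count:]

history: list[dict] = []
for i in range(15):
    history = append_to_history(history, {"role": "user", "content": f"msg {i}"},
                                count=10)

# Only the 10 most recent messages (msg 5 through msg 14) survive.
assert len(history) == 10
```

Older messages simply fall off the front of the window, which keeps each request's context (and token cost) bounded.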

 

Messages

This is a parameter to tell the LLM where we are storing those messages, so we’re going to set it to reference the global variable we just set. 

Voila! The overall script step should look something like this when you’re done. 

 

You should now be able to submit a prompt to OpenAI, get a response, and have the last ten messages saved as context for further prompts.  This is just the tip of the iceberg in terms of the AI features and what you can do with them.
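Putting the whole flow together, one chat turn looks roughly like this in Python (the model call is stubbed out; a real implementation would send the request to the provider's endpoint, and all names here are illustrative):

```python
def stub_model(messages: list[dict]) -> str:
    """Pretend model: echoes the latest user prompt."""
    return f"You said: {messages[-1]['content']}"

def chat_turn(history: list[dict], user_prompt: str,
              count: int = 10) -> tuple[str, list[dict]]:
    """One round trip: submit prompt plus history, then save both
    sides of the exchange, keeping only the last `count` messages."""
    history = history + [{"role": "user", "content": user_prompt}]
    reply = stub_model(history)
    history = history + [{"role": "assistant", "content": reply}]
    return reply, history[-count:]

history: list[dict] = []
reply, history = chat_turn(history, "Hello!")
```

This mirrors what the two script steps do for you: the configuration step fixes where requests go, and Generate Response From Model handles the prompt, response, and history bookkeeping on each turn.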

We created a sample FileMaker file that makes it easy to enter the details of your specific API key, model, and account name.  Please contact us to request the file!  As always, our team of certified FileMaker developers is here to help if you are interested in upgrading to FileMaker 2025 and implementing these groundbreaking AI tools.