Connect LLMs to APIs with Superface Hub API
The range of tools that can extend the capabilities of Large Language Models is growing all the time. To help developers save time and focus on their agent experience, we've created the Superface Hub API to quickly add tools and their associated function descriptions to LLM-powered applications.
If you'd like to jump straight in, take a look at the Hub API documentation and sign up for a free Superface account.
What is LLM function calling?
Large Language Models only know what they were trained on. That's a tremendous amount of information, updated with every new model release. However, they don't know about you (we hope), your business, your sales from last quarter, or how to update customer data in your cloud-based CRM.
For example, OpenAI's GPT-4 can't tell you how many qualified leads are in your pipeline because it doesn't have access to that data. Still, it could if it knew that there was a tool available to gather that information from an API.
This is where function calling steps up.
APIs for many models now support function calling, a way to describe a particular tool to an LLM along with the inputs it needs in order to use it.
How function calling works
To work with tools, LLMs need to understand their capabilities and the information needed to use them. With this information, a model can determine that a particular tool would be the best route to answering a user prompt and choose to use it.
These tools and their capabilities are defined in a "function description", typically represented as JSON. Below is an example of a function description for a tool that gets the current weather:
[
  {
    "type": "function",
    "function": {
      "name": "weather__current-weather__CurrentWeather",
      "description": "Retrieve current weather information for a specified location.\n",
      "parameters": {
        "type": "object",
        "required": ["city"],
        "properties": {
          "city": {
            "type": "string",
            "nullable": false,
            "description": "Name of the city including state and country, e.g.: \"Prague, Czech Republic\" or \"New York City, NY, USA\"",
            "title": "city"
          },
          "units": {
            "enum": ["C", "F", "K"],
            "description": "Units used to represent temperature - Fahrenheit, Celsius, Kelvin\nCelsius by default",
            "title": "units"
          }
        },
        "nullable": true
      }
    }
  }
]
This format is based on the OpenAPI JSON Schema. However, not all models use this exact format, as we explained in this LLM Function Calling article.
These function descriptions are sent to the model along with an initial prompt—almost like saying, "Here's the prompt, but if you don't know how to respond to this, look at this list of tools and choose one that will help, then tell the application that you want to run it."
Function descriptions are for information purposes only as far as the model is concerned.
LLMs don't directly make API calls based on these function descriptions (yet). An application is required to handle this. The image below demonstrates how the communication between the user, application, and model typically works.
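To make that flow concrete, here is a minimal sketch using the OpenAI Python SDK (v1.x) that sends the weather function description above along with a prompt. The model name, prompt, and trimmed-down descriptions are assumptions for illustration; the important part is that the model only replies with the name and arguments of the tool it wants the application to run.

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# A trimmed-down version of the weather function description shown above
tools = [
    {
        "type": "function",
        "function": {
            "name": "weather__current-weather__CurrentWeather",
            "description": "Retrieve current weather information for a specified location.",
            "parameters": {
                "type": "object",
                "required": ["city"],
                "properties": {
                    "city": {"type": "string", "description": "Name of the city, e.g. \"Prague, Czech Republic\""},
                    "units": {"enum": ["C", "F", "K"], "description": "Temperature units; Celsius by default"},
                },
            },
        },
    }
]

response = client.chat.completions.create(
    model="gpt-4",
    messages=[{"role": "user", "content": "What's the weather in Prague right now?"}],
    tools=tools,
)

message = response.choices[0].message
if message.tool_calls:
    call = message.tool_calls[0]
    # The model has not called anything; it is asking the application to run this tool.
    print(call.function.name, call.function.arguments)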
What is required to use function calling?
To implement function calling in your LLM-powered agent or application, you need the following:
- A description of one or more "tools" that your model can choose to use
- Logic in your application to make the API calls the model requests and return the results to it (see the sketch below)
Depending on your agent/application use case, you might also need the following:
- A way of managing authentication for users so they can provide their own credentials for tools that the model chooses.
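Here is a minimal sketch of that application-side logic, continuing from the previous example (it reuses the client and tools defined there). The fetch_current_weather helper is hypothetical and stands in for whatever real API call your application makes; the pattern of appending a "tool" message with the result and asking the model again is the part that matters.

import json

def fetch_current_weather(city: str, units: str = "C") -> dict:
    # Call your weather API of choice here; hard-coded for illustration.
    return {"city": city, "temperature": 21, "units": units}

messages = [{"role": "user", "content": "What's the weather in Prague right now?"}]
response = client.chat.completions.create(model="gpt-4", messages=messages, tools=tools)
message = response.choices[0].message

if message.tool_calls:
    messages.append(message)  # keep the model's tool-call request in the conversation
    for call in message.tool_calls:
        arguments = json.loads(call.function.arguments)
        result = fetch_current_weather(**arguments)  # the application makes the real API call
        messages.append(
            {"role": "tool", "tool_call_id": call.id, "content": json.dumps(result)}
        )
    # Send the tool results back so the model can produce the final answer for the user.
    final = client.chat.completions.create(model="gpt-4", messages=messages, tools=tools)
    print(final.choices[0].message.content)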
Superface Hub API
If you like the idea of working with function calling in your application or agent, but want to save time building function descriptions for your tools and writing the code that talks to their APIs, Superface's Hub API was built with you in mind.
The Hub API provides:
- Function descriptions for any tools that have been added to a Superface account.
- A way of securely authenticating users so they can provide their own credentials for tools.
- A single endpoint to execute the API calls for any tool the model wants to use.
Overall, this approach helps developers bring broad tool functionality to an agent or application with low code overhead and minimal management.
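As a rough sketch of what that looks like in practice, the example below fetches ready-made function descriptions from the Hub API and forwards a tool call chosen by the model to the single perform endpoint. The endpoint paths, headers, and response shapes shown here are assumptions for illustration; the Hub API documentation is the authoritative reference.

import requests

HUB_API = "https://pod.superface.ai/api/hub"  # assumed base URL; see the Hub API docs
HEADERS = {
    "Authorization": "Bearer <your-superface-api-token>",
    "x-superface-user-id": "your-end-user-id",  # identifies the user whose credentials are used
}

# 1. Fetch function descriptions for every tool added to your Superface account
#    and pass them to your model exactly as in the earlier examples.
tools = requests.get(f"{HUB_API}/fd", headers=HEADERS).json()

# 2. When the model picks a tool, forward its name and arguments to one endpoint
#    instead of writing per-API integration code.
def perform(function_name: str, arguments: dict) -> dict:
    response = requests.post(f"{HUB_API}/perform/{function_name}", headers=HEADERS, json=arguments)
    return response.json()

result = perform("weather__current-weather__CurrentWeather", {"city": "Prague, Czech Republic"})
print(result)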
The function descriptions Superface provides use the OpenAPI JSON Schema and can be used with OpenAI, MistralAI, Anthropic, and LangChain, with more on the way.
For more information on how to implement this in your application, check out our documentation, or drop us an email at support@superface.ai, and we'll gladly tell you more.
Tools without limits
Superface has a broad list of tools that you can use immediately, such as Google Suite, HubSpot, Salesforce, Todoist, Slack, Notion, and many more. You can also build and manage tools and use Superface to make them available for LLM Function Calling.
If you have an OpenAPI Specification for the API you want to use, you can build a new tool for Superface. To learn how, take a look at our Creating Tools documentation.