Syntax Sunday: Custom API Wrapper for GPTs
This week for Syntax Sunday, we'll focus on creating an API wrapper to transform data from the National Hockey League (NHL) Web API game-log endpoint.
We will also update the Hockey Stats and Analysis Expert GPT to use this new API endpoint for returning game logs instead of the one provided by the NHL Web API.
I previously wrote a Syntax Sunday article on GPT Actions, which explains how to connect external data sources to OpenAI's GPTs to use with Actions. You can read the article here...
I also created a GitHub Repository with all of the required code and instructions to recreate the Hockey Stats and Analysis Expert GPT.
One limitation that I, along with other users, encountered is that certain NHL API endpoints return very large responses.
GPT-4 currently struggles to handle large API responses, often generating inaccurate values for statistics like shots, points, and games.
At times, it even fails to fully analyze the returned data and resorts to using placeholder text.
The issue was consistently noted when returning game log information for players.
As we are roughly 50 games into the season, a lot of data is returned and some of it is completely unnecessary.
The NHL API is not well-documented, and I couldn't find a way to refine the query using parameters. Therefore, I opted to create a custom API Wrapper instead. You know, for fun!
By using an API Wrapper for the NHL Web API, we gain the ability to effectively modify and refine the data retrieved from the game-log endpoint.
The custom API acts as a middle layer, allowing us to adjust and customize the initial request data according to specific needs and preferences.
GPT Actions are defined with an OpenAPI schema, which GPT-4 uses to construct and call requests. To enhance its query capabilities, I created an API Wrapper that adds query parameters for manipulating the NHL API game-log endpoint.
This allows for more specific queries when GPT-4 is responding to questions and ideally smaller API response objects for these queries. While this wrapper can be expanded to other endpoints, I focused specifically on the game-logs for now.
1.) Create an API Project: I created a NodeJS project using ExpressJS, swagger-ui-express, and swagger-jsdoc.
Swagger will be used to create the OpenAPI specification needed for the GPT Action.
I called this project: NHL-Stats-and-Analysis-Expert-API. You can view the repo and full source code here...
You can use whichever backend framework or language you prefer. NodeJS is just the quickest and easiest for me to deploy!
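For context, here is a minimal sketch of what the project entry point might look like; the file names, port, and router module are assumptions rather than the repo's exact layout:

```javascript
// index.js - illustrative entry point for the API wrapper (not the exact repo code)
const express = require('express');
const gameLogRouter = require('./routes/gameLog'); // hypothetical router module, sketched in the next steps

const app = express();

// Mount the game-log route alongside any future endpoints
app.use('/', gameLogRouter);

const PORT = process.env.PORT || 3000;
app.listen(PORT, () => console.log(`NHL API wrapper listening on port ${PORT}`));
```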
2.) Create a Game Log Route: I created a GET route for /game-log and a function named getGameLog to handle the request.
This request requires the playerId, seasonId, and gameTypeId parameters, as they are needed when making the request to the NHL API.
This is also where you are able to customize your API query and add your own parameters.
I added properties, limit, isAggregate, and isAscending as parameters. These will be used to transform the data from the NHL API response into more relevant response objects for the Hockey Stats and Analysis Expert GPT.
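A sketch of what the route and handler could look like, based on the description above; the validation and helper names are assumptions, not a copy of the repo:

```javascript
// routes/gameLog.js - illustrative sketch of the /game-log route
const express = require('express');
const router = express.Router();

// GET /game-log?playerId=...&seasonId=...&gameTypeId=...&properties=...&limit=...&isAggregate=...&isAscending=...
async function getGameLog(req, res) {
  const { playerId, seasonId, gameTypeId, properties, limit, isAggregate, isAscending } = req.query;

  // playerId, seasonId, and gameTypeId are required to call the NHL API
  if (!playerId || !seasonId || !gameTypeId) {
    return res.status(400).json({ error: 'playerId, seasonId, and gameTypeId are required' });
  }

  try {
    // fetchNhlGameLog and transformGameLog are sketched in steps 3 and 4 below
    const nhlData = await fetchNhlGameLog(playerId, seasonId, gameTypeId);
    const transformed = transformGameLog(nhlData, { properties, limit, isAggregate, isAscending });
    res.json(transformed);
  } catch (err) {
    res.status(502).json({ error: 'Failed to retrieve game log from the NHL API' });
  }
}

router.get('/game-log', getGameLog);
module.exports = router;
```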
3.) Call the NHL API: using the parameter values from the custom API route, we call the original NHL Web API game-log endpoint within our API.
This is the data that would normally be returned to the GPT Action.
Instead we keep it and pass it on to the next function...
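A sketch of that call is below; the endpoint path is the publicly known NHL Web API game-log URL and the global fetch call assumes Node 18+, so treat both as assumptions to verify against your own setup:

```javascript
// Illustrative sketch: call the original NHL Web API game-log endpoint (Node 18+ global fetch assumed)
async function fetchNhlGameLog(playerId, seasonId, gameTypeId) {
  // e.g. https://api-web.nhle.com/v1/player/8478402/game-log/20232024/2
  const url = `https://api-web.nhle.com/v1/player/${playerId}/game-log/${seasonId}/${gameTypeId}`;
  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(`NHL API responded with status ${response.status}`);
  }
  return response.json(); // the raw data that would normally go straight back to the GPT Action
}
```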
4.) Transform the Data: once returned, the data is passed to the transformGameLog function. It utilizes the remaining query parameters - properties, limit, isAggregate, and isAscending - to modify the data based on their values.
Providing these extra query parameters gives the GPT more options when building its query, resulting in a more specialized response.
This step is very important as this is how the original data is manipulated into a more precise response!
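Here is a minimal sketch of how such a transform could work, assuming the NHL response exposes a gameLog array; the filtering, sorting, and aggregation details are illustrative rather than the repo's exact implementation:

```javascript
// Illustrative sketch of transformGameLog: trims, sorts, limits, and optionally aggregates the NHL response
function transformGameLog(nhlData, { properties, limit, isAggregate, isAscending }) {
  let games = nhlData.gameLog || []; // assumes the NHL response contains a gameLog array

  // Keep only the requested properties (e.g. "goals,assists,points"), always retaining gameDate
  if (properties) {
    const keep = properties.split(',').map((p) => p.trim());
    games = games.map((game) =>
      Object.fromEntries(Object.entries(game).filter(([key]) => key === 'gameDate' || keep.includes(key)))
    );
  }

  // Sort by date, newest first unless isAscending=true
  games.sort((a, b) =>
    isAscending === 'true'
      ? new Date(a.gameDate) - new Date(b.gameDate)
      : new Date(b.gameDate) - new Date(a.gameDate)
  );

  // Cap the number of games returned
  if (limit) {
    games = games.slice(0, Number(limit));
  }

  // Optionally collapse the game list into totals for the numeric properties
  if (isAggregate === 'true') {
    const totals = {};
    for (const game of games) {
      for (const [key, value] of Object.entries(game)) {
        if (typeof value === 'number') totals[key] = (totals[key] || 0) + value;
      }
    }
    return { gamesCounted: games.length, totals };
  }

  return { gameLog: games };
}
```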
5.) Create Swagger Docs: I used swagger-ui-express and swagger-jsdoc to help create and manage the OpenAPI documentation. Swagger creates a nice UI so you can easily see and test your endpoints.
Try it here: https://bloodlinealpha.com/nhl-GPT/api-docs
Swagger can be used with the OpenAI GPT builder to auto-import the OpenAPI schema.
swagger-jsdoc allows you to use the jsDoc format to create your OpenAPI schema using YAML or JSON. I opted to use JSON as I prefer the formatting.
I did create a separate folder (routes/swagger/) for the OpenAPI JSON files. I find it easier to edit in a separate JSON file, as the jsDoc format requires you to place the schema in a comment block.
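The wiring for swagger-jsdoc and swagger-ui-express might look roughly like this; the option values, base URL, and file globs are assumptions rather than the repo's exact configuration:

```javascript
// Illustrative sketch: generate the OpenAPI spec with swagger-jsdoc and serve it with swagger-ui-express
// (assumes the Express app from the step 1 sketch)
const swaggerJsdoc = require('swagger-jsdoc');
const swaggerUi = require('swagger-ui-express');

const specs = swaggerJsdoc({
  definition: {
    openapi: '3.1.0',
    info: { title: 'NHL-Stats-and-Analysis-Expert-API', version: '1.0.0' },
    servers: [{ url: 'https://bloodlinealpha.com/nhl-GPT' }], // assumed base URL
  },
  apis: ['./routes/*.js'], // jsDoc comments that reference the JSON files in routes/swagger/
});

// Interactive UI for browsing and testing the endpoints
app.use('/api-docs', swaggerUi.serve, swaggerUi.setup(specs));

// Raw JSON schema that the GPT builder can import directly
app.get('/api-docs.json', (req, res) => res.json(specs));
```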
6.) Publish the NodeJS App: Many hosting providers support NodeJS app hosting. Make sure your provider supports the Node and package versions you are using to avoid compatibility issues, especially with older packages. Just find one that works for you!
Now that the API Wrapper is up and running, the GPT Actions need to be updated! You can see the new available query parameters above.
Swagger excels by offering a standardized approach to managing OpenAPI schemas, making it easier to build and connect web services.
If you head to: https://bloodlinealpha.com/nhl-GPT/api-docs.json you will see that this can be used within the GPT builder to automatically import the schema...
There may be a few errors within the schema editor, but they can be resolved easily by removing the flagged fields. With a few minor adjustments it is ready to go!
The GPT can now provide improved results when addressing questions by integrating the new query parameters from the API Wrapper.
Here is an example of the new query parameters being used by the Actions. You can view the chat here...
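For illustration, a request using the new parameters could look like the following (the player ID and values are hypothetical):

```
GET /game-log?playerId=8478402&seasonId=20232024&gameTypeId=2&properties=goals,assists,points&limit=10&isAggregate=false&isAscending=false
```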
Despite the limitations of the NHL API, it is clear we can enhance the performance of the Hockey Stats and Analysis Expert GPT by using an API Wrapper to modify the original request data.
Overall, I like this approach because it works well and is applicable to a wide variety of use cases.
Currently, dumping a large amount of data into a GPT and expecting an accurate or valid response is not practical. Most of the time, it will produce hallucinated or speculative outputs, especially when it comes to numbers.
A big problem is that some APIs (paid or public) are not customizable or specific enough to be used with GPTs.
Creating and using API wrappers allows you to define your own queries and parameters, so you return only the data your GPT needs.
For GPTs that rely on a paid or public API, this approach improves responses by allowing access to more specific queries.
There are some downsides to this approach too:
API Maintenance: You'll need to maintain and ensure the proper functioning of your own API service, which can add both time and costs.
API Changes: API providers regularly update their free and paid APIs, making changes to endpoints, data types, and more. It's important to stay on top of these changes and test thoroughly.
Links to the GitHub Repo for this API Wrapper and all future examples will be on: https://bloodlinealpha.com/
If you have any questions about APIs or GPTs, send me a LinkedIn message or email me at: bloodlinealpha@gmail.com.
KH