Simplifying OpenAI Function Calling with Structured Output: A 2024 Guide
OpenAI’s function calling feature, introduced earlier, empowers developers to create interactive applications by enabling language models to call functions. This feature allows models to return structured data that can trigger APIs, extract data, or perform tasks. Recently, OpenAI has enhanced this capability with Structured Outputs, simplifying the way developers interact with models. Now, instead of defining complex JSON schemas for each function, the new feature allows developers to define output structures using simpler SDK tools, making function calling more accessible and efficient. In this blog, we’ll compare our previous blog “OpenAI Function Calling With External API Examples” with the new structured output feature, demonstrating how this new approach significantly reduces complexity while improving functionality. Before starting with this blog, please generate API keys for Finnhub and Alpha Vantage from our previous blog.
Once you have generated the API keys, you can start with the following steps. First, install the required libraries. Next, define the necessary variables that hold the values for the API keys and the OpenAI model name you want to use. After that, initialize the OpenAI and Finnhub clients for further processing.
Structured outputs make a model follow a JSON Schema definition that you provide as part of your inference API call. This is in contrast to the older JSON mode feature, which guaranteed valid JSON would be generated but was unable to ensure strict adherence to the supplied schema. Structured outputs are recommended for function calling, extracting structured data, and building complex multi-step workflows. You can use Pydantic to define object schemas in Python.
Depending on what version of the OpenAI and Pydantic libraries you're running, you might need to upgrade to a newer version. These examples were tested against openai 1.42.0 and pydantic 2.8.2. If you are new to using Microsoft Entra ID for authentication, see How to configure Azure OpenAI in Microsoft Foundry Models with Microsoft Entra ID authentication. Previously, structured output could be created by toggling JSON mode on or off, or by making use of function calling.
Large Language Models (LLMs), much like conversational UIs in general, excel at handling unstructured data presented as natural language. This unstructured input is first organised and processed, then transformed back into natural language as a structured response. Previously, two options were available: JSON Mode and Function Calling. Enabling OpenAI’s JSON mode doesn’t ensure that the output will adhere to a specific predefined JSON schema. It only guarantees that the JSON will be valid and parse without errors. In the realm of AI-driven applications, ensuring consistent and predictable outputs is paramount.
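The difference between the two modes shows up in the `response_format` parameter of the request. A sketch of the two shapes (the schema contents are illustrative):

```python
# JSON mode: the output is guaranteed to parse as JSON,
# but no particular shape is enforced.
json_mode_format = {"type": "json_object"}

# Structured Outputs: strict adherence to the supplied schema is enforced.
structured_format = {
    "type": "json_schema",
    "json_schema": {
        "name": "stock_quote",
        "strict": True,
        "schema": {
            "type": "object",
            "properties": {
                "symbol": {"type": "string"},
                "price": {"type": "number"},
            },
            "required": ["symbol", "price"],
            "additionalProperties": False,
        },
    },
}
```

With JSON mode you still have to validate the shape yourself after parsing; with the `json_schema` form and `strict: True`, the model cannot emit extra keys, omit required ones, or return the wrong types.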
OpenAI’s introduction of Structured Outputs addresses this need by allowing developers to define the exact format of the model’s responses, ensuring they adhere to specified schemas. Structured Outputs enable developers to constrain the model’s responses to a predefined structure, typically defined using JSON Schema. This ensures that the outputs are not only valid JSON but also match the expected format, reducing the need for post-processing and error handling.
🔧 Using a Pydantic model with text_format via the client.responses.parse method:
Step 1: Define the Pydantic model in models/document_extraction.py.
Step 2: Create a main.py and run the following code.
You will have the structured output in JSON format. Structured Outputs is a new capability in the Chat Completions API and Assistants API that guarantees the model will always generate responses that adhere to your supplied JSON Schema.
In this cookbook, we will illustrate this capability with a few examples. Structured Outputs can be enabled by setting the parameter strict: true in an API call with either a defined response format or function definitions. Previously, the response_format parameter was only available to specify that the model should return a valid JSON. In addition to this, we are introducing a new way of specifying which JSON schema to follow. Function calling remains similar, but with the new parameter strict: true, you can now ensure that the schema provided for the functions is strictly followed.
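For function calling, the guarantee is opted into by setting strict: true on the function definition itself. A sketch of a tool definition for a hypothetical stock-quote function (the function name and parameters are illustrative, not from the original post):

```python
# A tool definition with strict schema adherence enabled.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_stock_quote",
            "description": "Get the latest quote for a ticker symbol.",
            "strict": True,  # arguments must match the schema exactly
            "parameters": {
                "type": "object",
                "properties": {
                    "symbol": {
                        "type": "string",
                        "description": "Ticker symbol, e.g. AAPL",
                    },
                },
                "required": ["symbol"],
                # Required when strict is true: no extra keys allowed.
                "additionalProperties": False,
            },
        },
    }
]
```

Passing this list as the `tools` parameter of a Chat Completions call means any `get_stock_quote` arguments the model produces are guaranteed to contain exactly a string `symbol`, so the handler can forward them to the Finnhub client without defensive parsing.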