Function Calling With OpenAI’s ChatGPT API Using Python

Hello Learners…

Welcome to the blog…

Table Of Contents

  • Introduction
  • Function Calling With OpenAI’s ChatGPT API Using Python
  • Summary
  • References

Introduction

In this post, we discuss function calling with the OpenAI API using Python: while having a conversation with ChatGPT, we can have it call our own functions.

Function Calling With OpenAI’s ChatGPT API Using Python

When using OpenAI’s ChatGPT API with Python, we can integrate our own functions by defining them within our Python code.

What is Function Calling?

Function calling is a new way to connect GPT’s capabilities with external tools and APIs more reliably.

We can describe our functions to OpenAI’s gpt-4-0613 and gpt-3.5-turbo-0613 models, and the model intelligently chooses when to output a JSON object containing the arguments needed to call those functions.
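For example, instead of answering in plain text, the model can return an assistant message shaped like the Python dict below. This is only an illustration of the structure; the real output for our weather example appears later in this post.

# Illustrative shape of a function-calling response message (not a real API call)
assistant_message = {
    "role": "assistant",
    "content": None,
    "function_call": {
        "name": "get_current_weather",
        "arguments": '{"location": "mumbai"}',
    },
}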

How We Can Do Function Calling With OpenAI’s Chat Models Using Python

Currently, we can use these two models for function calling with the OpenAI API using Python:

  • gpt-4-0613
  • gpt-3.5-turbo-0613

GPT-4 is not accessible to everyone right now, so we use the gpt-3.5-turbo-0613 model for learning purposes.
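If you are not sure whether a model is available to your account, a quick check like the sketch below (written against the older openai Python SDK used in this post, with a placeholder API key) lists the model IDs your key can access:

import openai

openai.api_key = "YOUR_OPENAI_API_KEY"  # placeholder; later in this post we load the key from a .env file

# List every model ID available to this API key and check for the one we need
available_models = [m["id"] for m in openai.Model.list()["data"]]
print("gpt-3.5-turbo-0613" in available_models)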

So let’s start…

Environment Setup

To implement function calling using the OpenAI API, we first need an OpenAI API key; you can go to the URL below and get one.

NOTE: You don’t need to buy an OpenAI API key for personal use cases; if you have any further plans, then you can buy it.

Now we have to set our OpenAI API key in a .env file, as OpenAI recommends.

So create a .env file and put the API key in it as shown below:
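A minimal .env file looks like this; replace the placeholder with your actual key (the variable name OPENAI_API_KEY matches what the code later in this post reads):

OPENAI_API_KEY=your-api-key-here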

Now we have to install the required Python libraries: openai for calling the OpenAI models and python-dotenv for loading our API key from the .env file.

pip install openai python-dotenv 

In this post, we are going to call the function below using the OpenAI API:

def get_current_weather(location, unit="fahrenheit"):
    """Get the current weather in a given location"""
    weather_info = {
        "location": location,
        "temperature": "39", # set by us
        "unit": unit,
        "forecast": ["sunny", "windy"],
    }
    return json.dumps(weather_info)

This is a simple function used just to understand function calling with the OpenAI models API; you can use any function you want.

In this function, we hard-code the ‘temperature’ value as ’39’, which is what we will see in the response.
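As a quick sanity check, we can call the function directly, without involving any model. The snippet below is purely illustrative and assumes get_current_weather and import json from above are already in the file:

# Direct call, no model involved
print(get_current_weather("mumbai", unit="celsius"))
# {"location": "mumbai", "temperature": "39", "unit": "celsius", "forecast": ["sunny", "windy"]}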

After installing the required Python libraries, we create an app.py file and paste the code below into it.

Here’s an example of how we can call our own functions while interacting with the ChatGPT API:

import os
import openai
import json 

from dotenv import load_dotenv,find_dotenv
__ = load_dotenv(find_dotenv()) #read local .env file
openai.api_key=os.environ["OPENAI_API_KEY"]


def get_current_weather(location, unit="fahrenheit"):
    """Get the current weather in a given location"""
    weather_info = {
        "location": location,
        "temperature": "39", # set by us
        "unit": unit,
        "forecast": ["sunny", "windy"],
    }
    return json.dumps(weather_info)
    


# Step 1: send the user question together with the function description;
# the model decides whether to answer directly or to call our function.
def first_response(user_text):
    completion = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[
       
        {"role": "user", "content": user_text}
    ],
    functions = [
        {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. bangalore, Delhi",
                    },
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location"],
            },
        }
    ]
    )
    result=completion.choices[0]['message']
    print("Answer>>>>>>",result)
    return result

# Step 2: send back the model's function_call message along with the actual
# result of our function, so the model can write the final answer.
def second_response(first_res_data,function_args):
    completion = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        first_res_data,
       {
                "role": "function",
                "name": first_res_data["function_call"]["name"],
                "content": get_current_weather(
            location=function_args.get("location"),
            unit=function_args.get("unit"),),
        }

    ],
    functions = [
        {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. Gujarat, delhi",
                    },
                    
                },
                "required": ["location"],
            },
        }
    ]
    )
    result2=completion.choices[0]['message']
    print("Answer>>>>>>",result2)
    return result2



if __name__=='__main__':
    question="what is the temperature in mumbai"
    first_response_data=first_response(question)
    # parse the JSON arguments the model produced for our function
    function_args = json.loads(first_response_data["function_call"]["arguments"])
    final_response_data=second_response(first_response_data,function_args)

Now we run the app.py file and see the response.
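From the folder that contains app.py and the .env file, the script can be run with:

python app.py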

In this code, we asked the question:

question="what is the temprature in mumbai"

The final response of the above code is:

Answer>>>>>> {
  "role": "assistant",
  "content": null,
  "function_call": {
    "name": "get_current_weather",
    "arguments": "{\n  \"location\": \"mumbai\"\n}"
  }
}
Answer>>>>>> {
  "role": "assistant",
  "content": "The current weather in Mumbai is 39 degrees Celsius with a forecast of sunny and windy."
}

This is the response we can see in our terminal window.

So here we just called a simple function with the OpenAI chat models; in the same way, we can plug in any other function or API and get a response based on it, as sketched below.
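For example, the functions list passed to openai.ChatCompletion.create can describe more than one function, and the model will pick whichever fits the question. The sketch below adds a second, purely hypothetical get_stock_price function (its name and parameters are made up for illustration) next to the weather one:

functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "The city and state, e.g. Mumbai, Maharashtra"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
        },
    },
    {
        # Hypothetical second function for illustration only
        "name": "get_stock_price",
        "description": "Get the latest stock price for a ticker symbol",
        "parameters": {
            "type": "object",
            "properties": {
                "symbol": {"type": "string", "description": "The ticker symbol, e.g. INFY"},
            },
            "required": ["symbol"],
        },
    },
]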

You can download the full code from the GitHub URL below:

Summary

In this post, we saw how function calling works with OpenAI’s gpt-3.5-turbo-0613 model in Python: we described the get_current_weather function to the model, the model returned a JSON function_call with the arguments, we executed the function ourselves, and we sent the result back so the model could produce the final natural-language answer.


Happy Learning And Keep Learning…

References
