
Problem Statement
There are many questions a user may wish to ask that ChatGPT is unable to answer. Many of these involve real-time information, or events that occurred after the data on which the model was trained. Examples of such questions are:
- What is the current stock price of Apple?
- What is the current temperature in Beijing, China?
- Who is the current English Premier League winner?
If we submit a question like the first one to ChatGPT (here asking for Google's price rather than Apple's), we would get something like the following response:
“I’m sorry, as an AI language model, I don’t have access to real-time data. However, as of my training data, the current price of GOOG (Google, the stock symbol for Alphabet Inc) varies on a daily basis, and you can search for the current price on financial websites like Yahoo Finance, Google Finance, or NASDAQ.”
To help solve this problem, OpenAI has released a new feature: Function Calling.
Function Calling Explained
Function calling is a new feature in the Chat Completions API. At the time of writing it is only available in two models: gpt-4-0613 and gpt-3.5-turbo-0613. It allows the user to describe a function to ChatGPT and then ask a natural-language question related to that function. The model responds with JSON specifying the parameters or arguments to pass to the described function; running the function with those arguments produces an answer to the question (assuming the function is relevant to it).
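For example, the relevant portion of such a response might look like the following (an illustrative shape only, assuming a hypothetical stock_symbol parameter; a complete response is shown in the demonstration below):
"function_call": {
  "name": "getLatestStockPrice",
  "arguments": "{ \"stock_symbol\": \"AAPL\" }"
}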
A brief summary of how this works is as follows:
- The user describes the function getLatestStockPrice(..) to ChatGPT.
- The user then poses a question: What is the current stock price of Apple?
- ChatGPT responds with JSON stating that the function should be called as getLatestStockPrice("Apple"). Note that the API doesn't call the function itself; instead it generates JSON output that the user can then use to call the function in her own code.
- The client code parses the GPT response to extract the function's arguments, calls the function, and returns the answer to the user.
Demonstration
Here is code that demonstrates what I have described above.
We will be using some form of Jupyter notebook to run the code. In my specific case, I used Google’s Colab for convenience.
First we install the necessary libraries:
!pip install openai finnhub-python
The first module above is openai, and the second is a client for Finnhub, an API we can use to obtain free real-time stock price quotes.
We can now make a call to the ChatCompletion API, asking it for the current stock price of, say, Google:
import openai

# NOTE: openai.api_key must be set before this call; the key setup is shown further below
stock_prx_response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is the current price of Google"},
    ]
)
We then get the following response:
stock_prx_response
<OpenAIObject chat.completion id=chatcmpl-7U9T8j1XQabN9jsRDmLV4VySEGXu4 at 0x7f7d4925b100> JSON: {
  "id": "chatcmpl-7U9T8j1XQabN9jsRDmLV4VySEGXu4",
  "object": "chat.completion",
  "created": 1687420594,
  "model": "gpt-3.5-turbo-0301",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "I'm sorry, as an AI language model, I don't have access to real-time data. However, as of my training data, the current price of GOOG (Google, the stock symbol for Alphabet Inc) varies on a daily basis, and you can search for the current price on financial websites like Yahoo Finance, Google Finance, or NASDAQ."
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 27,
    "completion_tokens": 72,
    "total_tokens": 99
  }
}
As we can see from the response content, ChatGPT is unable to answer the question because it cannot query real-time data.
So let’s see how Function Calling can help us.
First, we obtain our API keys from GCP Secret Manager:
from google.colab import auth
from google.cloud import secretmanager
import openai

auth.authenticate_user()
project_id = "999999999999"

def get_secret_from_gcp(secret_name, project_id):
    # Fetch the latest version of the named secret from GCP Secret Manager
    client = secretmanager.SecretManagerServiceClient()
    resource_name = f"projects/{project_id}/secrets/{secret_name}/versions/latest"
    response = client.access_secret_version(name=resource_name)
    secret = response.payload.data.decode('utf-8')
    return secret

openai.api_key = get_secret_from_gcp("OPENAI_API_KEY", project_id)
FINNHUB_API_KEY = get_secret_from_gcp("FINNHUB_API_KEY", project_id)
See this blog post, Secrets Access in Google Colab Notebooks, for details on how to do this.
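If you are not running in Colab, a minimal alternative sketch is to read the two keys from environment variables instead (this assumes you have exported them as OPENAI_API_KEY and FINNHUB_API_KEY):
import os
import openai

# Alternative to GCP Secret Manager: read the keys from environment variables
# (assumes OPENAI_API_KEY and FINNHUB_API_KEY were exported beforehand)
openai.api_key = os.environ["OPENAI_API_KEY"]
FINNHUB_API_KEY = os.environ["FINNHUB_API_KEY"]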
Next, we cook up a function that we can use to obtain the latest stock price:
import openai
import os
import finnhub

def get_latest_stock_price(stock_symbol):
    finnhub_client = finnhub.Client(api_key=FINNHUB_API_KEY)
    # 'c' is the current price field in Finnhub's quote response
    px_key = 'c'
    latest_prx = finnhub_client.quote(stock_symbol)[px_key]
    return latest_prx
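As a quick sanity check, we can call the helper directly; the value returned will of course differ whenever you run it:
# Quick manual test of the helper (the price will vary from run to run)
print(get_latest_stock_price("GOOG"))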
Next, we write a block of code that describes this function to ChatGPT:
get_stock_price_func = {
    "name": "get_latest_stock_price",
    "description": "Get latest stock price for symbol",
    "parameters": {
        "type": "object",
        "properties": {
            "stock_symbol": {
                "type": "string",
                "description": "The stock symbol to look up latest price for"
            }
        },
        "required": ["stock_symbol"]
    }
}
We now make a call to the ChatCompletion API and pass in the function description:
prompt="Get latest stock price for Google"
response = openai.ChatCompletion.create(
model="gpt-3.5-turbo-0613",
messages=[
{"role": "user", "content": prompt},
],
functions=[get_stock_price_func],
function_call="auto"
)
Note the differences between this call to the ChatCompletion API and the earlier one:
- Use of a newer model: gpt-3.5-turbo-0613 vs gpt-3.5-turbo.
- Two additional parameters: functions and function_call.

The newer models gpt-3.5-turbo-0613 and gpt-4-0613 provide the function calling capability, so one of them must be specified.

The functions parameter is set to a list of function descriptions.

The function_call parameter is set to auto, specifying that the API should decide whether a function matches the request and, if so, generate the arguments that would be needed. Possible values are:
- auto – the default; the model decides whether to call a function and which one.
- none – prevents the model from calling any functions.
- {"name": "get_latest_stock_price"} – explicitly states the function that should be called (see the sketch after this list).
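For example, to force the model to call our function rather than letting it decide, we could pass the named form; a minimal sketch reusing the same function description:
# Force the model to call get_latest_stock_price instead of letting it choose
forced_response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[
        {"role": "user", "content": "Get latest stock price for Google"},
    ],
    functions=[get_stock_price_func],
    function_call={"name": "get_latest_stock_price"},
)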
ChatCompletion API Response
The call to the ChatCompletion API returns a response that looks like the following:
<OpenAIObject chat.completion id=chatcmpl-7pvGZwmv0SCBG2X5jY2Bhcg5hTTXv at 0x7f4d651bae30> JSON: {
  "id": "chatcmpl-7pvGZwmv0SCBG2X5jY2Bhcg5hTTXv",
  "object": "chat.completion",
  "created": 1692609215,
  "model": "gpt-3.5-turbo-0613",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": null,
        "function_call": {
          "name": "get_latest_stock_price",
          "arguments": "{\n\"stock_symbol\": \"GOOG\"\n}"
        }
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 75,
    "completion_tokens": 10,
    "total_tokens": 85
  }
}
Note the function_call section, which specifies the function to be called as well as the arguments to pass to it. This is the response for either the "auto" or named-function options passed to function_call. When "none" is passed, there is no function_call section in the response.
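Because the model is not guaranteed to return a function_call (for example when function_call="none" is used, or when it decides no function is needed), it is prudent to check for the section before parsing it. A minimal sketch:
msg = response.choices[0].message

# Only parse arguments if the model actually requested a function call
if msg.get("function_call"):
    print("Model wants to call:", msg.function_call.name)
else:
    print("No function call requested; model replied:", msg.content)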
We can parse this response and obtain the function to call along with recommended parameters:
output = response.choices[0].message
print(output)
Next we call our function with its parameters:
import json
# extract function
my_func = eval(output.function_call.name)
params = json.loads(output.function_call.arguments)
# call function, unpacking params
stock_price = my_func(**params)
print(f"Stock price = {stock_price}")
Now let's add the function output back to the messages and make a second API call:
prompt="Get latest stock price for GOOG"
completion2 = openai.ChatCompletion.create(
model="gpt-3.5-turbo-0613",
messages=[
{"role": "user", "content": prompt},
{"role": "function",
"name": output.function_call.name,
"content": str(stock_price)},
],
functions=[get_stock_price_func],
)
response2 = completion2.choices[0].message.content
print(response2)
This produces the following response:
The latest stock price for GOOG (Google) is $130.69.
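A variation worth noting: OpenAI's own examples also include the assistant message that contained the function_call before the function result, so the model sees what it asked for. A sketch of the second call with that message added:
# Variant of the second call that also echoes back the assistant's function_call turn
completion2b = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[
        {"role": "user", "content": prompt},
        output,  # the assistant message containing the function_call
        {"role": "function",
         "name": output.function_call.name,
         "content": str(stock_price)},
    ],
    functions=[get_stock_price_func],
)
print(completion2b.choices[0].message.content)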
Conclusions
From an unstructured user question, function calling lets us convert it into a structured function call, with parameters, that can be executed to answer the question.
This is the power of function calling. It enables the app designer to do the following:
- Enable the model to answer questions about events that occurred after the corpus of data it was trained on.
- Convert an unstructured human language user input question to a structured function call with parameters that can be executed to answer the question.
So from the example above we should note the following:
- We can use function calling as a means to enhance the functionality of ChatGPT to provide real-time results such as getting stock prices, sports scores or weather reports.
- We cannot use OpenAI to actually call our functions. Instead, OpenAI identifies the function and parameters that should be used; it is still our code's responsibility to execute the function.