----------------------
Yes, you can use async methods with chat as well. The `achat` and `astream_chat` methods are the async counterparts of `chat` and `stream_chat` for use in a chat context. Here is an example of how you can use them:
```python
import asyncio

from llama_index.core.llms import ChatMessage
from llama_index.llms.azure_openai import AzureOpenAI

llm = AzureOpenAI(
    model="YOUR_AZURE_OPENAI_COMPLETION_MODEL_NAME",
    deployment_name="YOUR_AZURE_OPENAI_COMPLETION_DEPLOYMENT_NAME",
    api_key="YOUR_AZURE_OPENAI_API_KEY",
    azure_endpoint="YOUR_AZURE_OPENAI_ENDPOINT",
    api_version="2023-07-01-preview",
)

messages = [
    ChatMessage(role="system", content="You are a helpful assistant."),
    ChatMessage(role="user", content="Who won the world series?"),
]

async def main():
    # One-shot async chat
    resp = await llm.achat(messages)
    print(resp)

    # Streaming async chat: iterate over the async generator of deltas
    resp = await llm.astream_chat(messages)
    async for delta in resp:
        print(delta.delta, end="")

asyncio.run(main())
```
Please replace the placeholders with your actual values.
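The main payoff of the async methods is concurrency: several chat calls can be in flight at once instead of waiting on each other. Here is a minimal sketch of that pattern using plain `asyncio` with a hypothetical `fake_achat` stub standing in for `llm.achat`, so it runs without Azure credentials:

```python
import asyncio

# Hypothetical stand-in for llm.achat; a real call would hit Azure OpenAI.
async def fake_achat(messages):
    await asyncio.sleep(0.1)  # simulate network latency
    return f"reply to: {messages[-1]}"

async def main():
    # Three requests issued concurrently with asyncio.gather;
    # total wall time is roughly one request's latency, not three.
    return await asyncio.gather(
        fake_achat(["Who won the world series?"]),
        fake_achat(["Who won the MVP?"]),
        fake_achat(["Who was the losing team?"]),
    )

replies = asyncio.run(main())
print(replies)
```

With the real `llm.achat`, the same `asyncio.gather` pattern applies: pass several `llm.achat(messages)` coroutines and await them together.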
You can find more details in the LlamaIndex documentation.
----------------------
Note: you can ask me a follow-up question by @ mentioning me again :speech_balloon:
----------------------