Custom

Hi, I am looking for a way to do the same.
2 comments
@woojim just wrap the API into a custom LLM object.

If the API supports message roles, you can also implement the chat/stream_chat functions.

https://gpt-index.readthedocs.io/en/stable/core_modules/model_modules/llms/usage_custom.html#example-using-a-custom-llm-model-advanced
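For reference, here is a minimal sketch of that pattern, based on the CustomLLM example in the linked docs. The `call_my_api` helper is a hypothetical stand-in for whatever client your API exposes, and the exact import paths may vary with your llama-index version.

```python
from llama_index.llms import (
    CustomLLM,
    CompletionResponse,
    CompletionResponseGen,
    LLMMetadata,
)
from llama_index.llms.base import llm_completion_callback


def call_my_api(prompt: str) -> str:
    """Hypothetical wrapper around your external completion API."""
    raise NotImplementedError


class MyAPILLM(CustomLLM):
    context_window: int = 3900
    num_output: int = 256
    model_name: str = "my-api-model"

    @property
    def metadata(self) -> LLMMetadata:
        # Describe the model so LlamaIndex can size prompts correctly.
        return LLMMetadata(
            context_window=self.context_window,
            num_output=self.num_output,
            model_name=self.model_name,
        )

    @llm_completion_callback()
    def complete(self, prompt: str, **kwargs) -> CompletionResponse:
        # Forward the prompt to the external API and wrap the raw text.
        text = call_my_api(prompt)
        return CompletionResponse(text=text)

    @llm_completion_callback()
    def stream_complete(self, prompt: str, **kwargs) -> CompletionResponseGen:
        # Naive "streaming": call once, then yield the text piece by piece.
        text = call_my_api(prompt)
        response = ""
        for token in text.split():
            response += token + " "
            yield CompletionResponse(text=response, delta=token + " ")
```

If the API accepts message roles, the same class can override chat/stream_chat directly instead of routing everything through complete.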
Oh, thanks for the link! That's what I had in mind and was wondering about.