Updated 11 months ago
Hi everyone, how can I use API-URL as a LLM?
Ar#9696
11 months ago
Hi everyone, how can I use an API URL as an LLM? Normally this is a POST request. How can I work with this instead of calling a model through the Hugging Face library?
7 comments
Logan M
11 months ago
You can wrap your api requests with a custom LLM
https://docs.llamaindex.ai/en/stable/module_guides/models/llms/usage_custom.html#example-using-a-custom-llm-model-advanced
Ar#9696
11 months ago
Hi @Logan M, thanks for reaching out. Here, in place of dummy_response, I can give my API URL, right?
I.e. dummy_response = 'http://23.34.56:8001/model'
Ar#9696
11 months ago
Am I following right?
Logan M
11 months ago
Exactly, you would make an api request there
Logan M
11 months ago
requests.post(url) or whatever your API needs
Ar#9696
11 months ago
Thanks, will check that.
Ar#9696
11 months ago
@Logan M Btw, an LLM is not required for building a service_context, right? If we do use an LLM, how will it develop the context, and if we don't, what will the data look like?