
Is it possible to use LlamaIndex as a utility toolkit? Meaning, instead of calling chains (LLMChain and PromptTemplate), call OpenAI's openai.ChatCompletion.create directly, and manually create my own context, memory, etc.?
You can use every single component in LlamaIndex individually.

For example, you can call the base-level LLM directly:
https://gpt-index.readthedocs.io/en/stable/core_modules/model_modules/llms/usage_standalone.html

Basically any base-level component can be used like this (retrievers, response synthesizers, memory, node postprocessors, embeddings, and LLMs).
Thanks! Is it possible to use LlamaIndex without its base-level LLM, and just use OpenAI's library instead?
Not really. LlamaIndex needs to know how to call OpenAI, and it does that through the LLM object.

You could, however, write your own LLM class if there's something you want to customize.

no_text: Only runs the retriever to fetch the nodes that would have been sent to the LLM, without actually sending them. They can then be inspected by checking response.source_nodes. The response object is covered in more detail in Section 5.

Would using no_text and then calling OpenAI myself just work?
Yeah, that works: just take the text from the nodes and build your prompt from there.
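The flow described above can be sketched roughly like this. This is a hypothetical illustration, not LlamaIndex API code: it assumes you already ran a query with response_mode="no_text" and pulled the text out of response.source_nodes (the node_texts list below is stand-in data), and it stops short of the actual OpenAI call, which needs an API key.

```python
def build_messages(question, node_texts):
    """Pack retrieved node text into a chat-style prompt for a manual
    OpenAI call (hypothetical helper, not part of LlamaIndex)."""
    context = "\n\n".join(node_texts)
    return [
        {"role": "system",
         "content": f"Answer using only this context:\n{context}"},
        {"role": "user", "content": question},
    ]

# Stand-in for text pulled from response.source_nodes after a
# response_mode="no_text" query.
node_texts = [
    "LlamaIndex components can be used individually.",
    "The no_text mode runs only the retriever.",
]
messages = build_messages("Can I call OpenAI directly?", node_texts)

# Then call OpenAI yourself, e.g.:
# import openai
# answer = openai.ChatCompletion.create(
#     model="gpt-3.5-turbo", messages=messages)
```

From there you can manage memory, context windows, and prompt formatting entirely on your side, using LlamaIndex only for retrieval.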