
Updated 9 months ago


Hi everyone, I wanted to ask how LlamaIndex calls the LLM directly, without any query at all. Assuming my llm variable holds the LLM I'm calling, can I just call llm.query_engine()? Or what do I have to do in this case? I'm looking to chain multiple inputs together because I can't seem to get LangChain to work, so this would be very helpful. Thank you!
If you want to interact with the LLM directly, try this: llm.chat() or llm.complete("ADD YOUR query here")
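A minimal sketch of that pattern. The StubLLM class below is a stand-in invented for illustration so the example runs without an API key; a real llm would be a LlamaIndex LLM wrapper (e.g. an OpenAI instance), whose complete() and chat() return actual model responses rather than echoes.

```python
# Stand-in for a LlamaIndex-style LLM so the pattern runs offline.
# A real `llm` would come from LlamaIndex (e.g. an OpenAI wrapper);
# here complete()/chat() just echo their input for demonstration.
class StubLLM:
    def complete(self, prompt: str) -> str:
        # Single-turn completion: takes a raw prompt string.
        return f"completion for: {prompt}"

    def chat(self, messages: list[dict]) -> str:
        # Multi-turn chat: takes a running message history.
        last = messages[-1]["content"]
        return f"chat reply to: {last}"

llm = StubLLM()

# Direct completion call, as suggested above.
print(llm.complete("ADD YOUR query here"))

# Chat call with message history.
history = [
    {"role": "user", "content": "hello"},
    {"role": "user", "content": "summarize our chat"},
]
print(llm.chat(history))
```

The key point: both methods bypass any index or query engine and talk straight to the model.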
Wait, just another question: let's say I have a prompt template here. How would I integrate that into this?
llm.complete("YOUR PROMPT... Your query")

But do know that this is directly calling the LLM.
Yeap, got it. Thank you!