
Does this mean I can't use LlamaIndex's query function without LangChain?

At a glance

The community member asked whether they can use LlamaIndex's query function without LangChain. Another community member responded that LlamaIndex uses some LLM classes from LangChain and that the error message suggests they need to import and use the ChatOpenAI class for the LLM being used. The original poster noted that the error only occurs when using Streamlit and that they did not know how to "use" the ChatOpenAI class in this context. After some discussion, the original poster fixed the issue, but the solution was not explicitly shared.

Does this mean I can't use LlamaIndex's query function without LangChain?
5 comments
LlamaIndex uses some LLM classes from LangChain.

To me, this error says you need to import and use the ChatOpenAI class for the LLM you are using.
Thanks. It doesn't make sense, though, that I only get the error when using Streamlit 😅 I also imported it and nothing changed. I have to "use" it, but I don't know how to use it in this context.
What does the code in Streamlit look like?
Are you setting up a service context?
Ah...I fixed it. Thanks Logan! πŸ™‚
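
For context, here is a minimal sketch of the kind of setup being discussed above, assuming an older llama-index release (from the era of this thread) where LLMPredictor and ServiceContext were still part of the API. The data directory, model name, and query string are illustrative, not taken from the thread, and an OPENAI_API_KEY is expected in the environment:

```python
# Sketch only: wire a LangChain ChatOpenAI model into a LlamaIndex service context
# (legacy llama-index API with LLMPredictor and ServiceContext).
from langchain.chat_models import ChatOpenAI
from llama_index import (
    GPTVectorStoreIndex,
    LLMPredictor,
    ServiceContext,
    SimpleDirectoryReader,
)

# Wrap the LangChain chat model so LlamaIndex can use it as its LLM.
llm_predictor = LLMPredictor(llm=ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0))
service_context = ServiceContext.from_defaults(llm_predictor=llm_predictor)

# "data" is a placeholder directory of documents.
documents = SimpleDirectoryReader("data").load_data()
index = GPTVectorStoreIndex.from_documents(documents, service_context=service_context)

# Query through the index; the question is a placeholder.
response = index.as_query_engine().query("What does this document say?")
print(response)
```

In a Streamlit app the same setup applies; the index construction is typically cached (for example with st.cache_resource) so it is not rebuilt on every rerun.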