
LLM

Hello, I am trying to change the underlying LLM, but I am new to this and the example in the docs is not very helpful to me: https://gpt-index.readthedocs.io/en/latest/core_modules/model_modules/llms/usage_custom.html#example-changing-the-underlying-llm

I want to use GPT4All and am looking at the LangChain instructions too, but I do not know how to put it all together: https://python.langchain.com/docs/integrations/llms/gpt4all, https://python.langchain.com/docs/integrations/providers/gpt4all
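
For reference, the GPT4All page linked above instantiates the model through LangChain roughly like this (a minimal sketch; the model file path and the streaming callback are illustrative assumptions, not something from this thread):

Python
from langchain.llms import GPT4All
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler

# Path to a locally downloaded GPT4All model file (placeholder, adjust to your setup)
local_model_path = "./models/ggml-gpt4all-j-v1.3-groovy.bin"

# Optional: stream generated tokens to stdout as they are produced
callbacks = [StreamingStdOutCallbackHandler()]

gpt4all_llm = GPT4All(model=local_model_path, callbacks=callbacks, verbose=True)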
Python
from llama_index import ServiceContext, set_global_service_context
from llama_index.llms import LangChainLLM

# Wrap any LangChain LLM so LlamaIndex can use it
llm = LangChainLLM(llm=<lc_llm_here>)

# Make this LLM the default for all LlamaIndex operations
service_context = ServiceContext.from_defaults(llm=llm)
set_global_service_context(service_context)
Should it be something like that to use a LangChain LLM?
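
Putting the two together, a minimal sketch could look like the following (assuming the GPT4All setup above, a local embedding model so nothing falls back to OpenAI, and a ./data folder of documents; all of these are illustrative assumptions):

Python
from langchain.llms import GPT4All
from llama_index import (
    ServiceContext,
    SimpleDirectoryReader,
    VectorStoreIndex,
    set_global_service_context,
)
from llama_index.llms import LangChainLLM

# Wrap the LangChain GPT4All model so LlamaIndex can use it (model path is a placeholder)
lc_llm = GPT4All(model="./models/ggml-gpt4all-j-v1.3-groovy.bin")
llm = LangChainLLM(llm=lc_llm)

# Use a local embedding model so embeddings do not default to OpenAI
service_context = ServiceContext.from_defaults(llm=llm, embed_model="local")
set_global_service_context(service_context)

# Example usage: index the documents in ./data and run a query
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)
print(index.as_query_engine().query("What is this document about?"))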