Claude LLM

Can I create a GPTVectorStoreIndex using a Claude LLM in the service context?
Previously I have used OpenAI LLMs.
Plain Text
from llama_index import ServiceContext
from llama_index.llms import Anthropic

service_context = ServiceContext.from_defaults(llm=Anthropic())
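As a side note, a minimal sketch of setting this up end to end, assuming the ANTHROPIC_API_KEY environment variable is how the key is supplied and that the class accepts a model argument (the model name here is only an example):
Plain Text
import os
from llama_index import ServiceContext
from llama_index.llms import Anthropic

# Assumption: the Anthropic LLM reads the API key from the environment
os.environ["ANTHROPIC_API_KEY"] = "sk-ant-..."  # placeholder, use your own key

# Assumption: `model` selects which Claude model to use
service_context = ServiceContext.from_defaults(llm=Anthropic(model="claude-2"))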
Yes, but would streaming work like before? Before, we used this:
Plain Text
llm_predictor = LLMPredictor(llm=ChatOpenAI(temperature=0.7, model_name="gpt-3.5-turbo", streaming=True))
Yes, it should work.
Just replace ChatOpenAI with Anthropic
service_context = ServiceContext.from_defaults(llm=Anthropic(streaming=True))

Can I do it like this?
You can set streaming=True on as_query_engine and it should work
Plain Text
query_engine = index.as_query_engine(streaming=True)
streaming_response = query_engine.query("Who is Paul Graham.")

https://gpt-index.readthedocs.io/en/stable/core_modules/query_modules/query_engine/root.html#usage-pattern
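For reference, a minimal sketch of consuming the streamed output, assuming the legacy API where query() with streaming=True returns a StreamingResponse:
Plain Text
# Assumption: streaming_response is the StreamingResponse returned above

# Option 1: print tokens to stdout as they arrive
streaming_response.print_response_stream()

# Option 2: iterate over the token generator yourself
for token in streaming_response.response_gen:
    print(token, end="", flush=True)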
Plain Text
llm_predictor = Anthropic()

service_context = ServiceContext.from_defaults(
    llm_predictor=llm_predictor,
    chunk_size=1024
)

I have defined the service context like that and created the index.
index = GPTVectorStoreIndex.from_documents(document, service_context=service_context)

query_engine = index.as_query_engine(response_mode="compact", similarity_cutoff=0.6, similarity_top_k=2, streaming=True, text_qa_template=QA_PROMPT)

response = query_engine.query("How are you?")

But I got this error.

'Anthropic' object has no attribute 'stream'

Using OpenAI's models, it worked perfectly.
Plain Text
from llama_index import GPTVectorStoreIndex, LLMPredictor, ServiceContext
from langchain.llms import Anthropic

# Wrap LangChain's Anthropic LLM in an LLMPredictor
llm_predictor = LLMPredictor(llm=Anthropic())

service_context = ServiceContext.from_defaults(
    llm_predictor=llm_predictor
)

# `document` is your list of loaded Document objects
index = GPTVectorStoreIndex.from_documents(document, service_context=service_context)

query_engine = index.as_query_engine(streaming=True)

streaming_response = query_engine.query("Who is Paul Graham.")
You'll need to adapt this to your documents.
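For completeness, one common way to produce `document` is with SimpleDirectoryReader; the "./data" path below is just a placeholder assumption:
Plain Text
from llama_index import SimpleDirectoryReader

# "./data" is a placeholder directory; point it at your own files.
# load_data() returns a list of Document objects.
document = SimpleDirectoryReader("./data").load_data()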