graph = ComposableGraph.from_indices(
GPTListIndex,
index_arr,
index_summaries=summaries,
service_context=service_context
)
Hi, when I add service_context to ComposableGraph.from_indices, I always get this error:
ValueError: Got a larger chunk overlap (20) than chunk size (-42), should be smaller.
This is my service_context:
# imports assumed for this older llama_index API (LLMPredictor wrapping a langchain OpenAI LLM)
from langchain.llms import OpenAI
from llama_index import LLMPredictor, PromptHelper, ServiceContext

# set maximum input size
max_input_size = 256
# set number of output tokens
num_outputs = 256
# set maximum chunk overlap
max_chunk_overlap = 20
# set chunk size limit
chunk_size_limit = 512
# define LLM
llm_predictor = LLMPredictor(llm=OpenAI(temperature=0.2, model_name="gpt-3.5-turbo", max_tokens=num_outputs))
prompt_helper = PromptHelper(max_input_size, num_outputs, max_chunk_overlap, chunk_size_limit=chunk_size_limit)
service_context = ServiceContext.from_defaults(llm_predictor=llm_predictor, prompt_helper=prompt_helper, chunk_size_limit=chunk_size_limit)
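For what it's worth, here is my rough guess at where the -42 might come from, based only on the values above (the prompt_tokens figure is a hypothetical number I picked so the arithmetic matches the error, not something I pulled from the library):

# rough sketch of the arithmetic I assume PromptHelper is doing
max_input_size = 256
num_outputs = 256
prompt_tokens = 42   # assumed token count of the internal prompt template

# tokens left over for an actual chunk of text
chunk_size = max_input_size - num_outputs - prompt_tokens
print(chunk_size)    # -42, which is smaller than max_chunk_overlap = 20

Is that the right way to think about it, i.e. is max_input_size=256 just too small once num_outputs=256 is reserved?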