
Is it possible to get time info from an auto-merge RAG in llama_index?
I would also like to pass it a custom chat template (ChatML) to use. Is that possible?
Here is my code snippet. @Teemu πŸ€“
Plain Text
# Imports assumed for the legacy llama_index 0.9.x layout (matching the ServiceContext usage below).
from llama_index import ServiceContext, StorageContext, VectorStoreIndex
from llama_index.llms import OpenAILike
from llama_index.node_parser import HierarchicalNodeParser, get_leaf_nodes
from llama_index.retrievers import AutoMergingRetriever
from llama_index.query_engine import RetrieverQueryEngine

# embed_model, documents, chunk_sizes, similarity_top_k, rerank_model and
# query_prompt are defined elsewhere in the script.

# LLM served behind an OpenAI-compatible endpoint
llm = OpenAILike(
    # model="dpo-hermes",
    model="gpt-3.5-turbo",
    is_chat_model=False,
    # is_chat_model=True,
    timeout=60,
    is_function_calling_model=False,
    temperature=0.01,
    api_key="sk-dummy",
    api_base="http://dpo-hermes:8081/v1",
    max_tokens=128,
    verbose=True,
    # Note: non-standard sampling params are typically forwarded to an
    # OpenAI-compatible server via additional_kwargs, e.g.
    # additional_kwargs={"repetition_penalty": 1.5}
    repetition_penalty=1.5,
)
service_context = ServiceContext.from_defaults(
    llm=llm,
    embed_model=embed_model,
)

# Parse documents into a node hierarchy; only the leaf nodes are embedded,
# while the full hierarchy goes into the docstore for auto-merging later.
node_parser = HierarchicalNodeParser.from_defaults(chunk_sizes=chunk_sizes)
nodes = node_parser.get_nodes_from_documents(documents)
leaf_nodes = get_leaf_nodes(nodes)
storage_context = StorageContext.from_defaults()
storage_context.docstore.add_documents(nodes)

# Create the vector store index over the leaf nodes
index = VectorStoreIndex(
    leaf_nodes, storage_context=storage_context, service_context=service_context
)
base_retriever = index.as_retriever(similarity_top_k=similarity_top_k)

# Wrap the base retriever so retrieved leaf chunks get merged into their parents
retriever = AutoMergingRetriever(
    base_retriever, index.storage_context, verbose=True
)
auto_merging_engine = RetrieverQueryEngine.from_args(
    retriever,
    service_context=service_context,
    node_postprocessors=[rerank_model],
    response_mode="compact",
)
auto_merge_rag_response = auto_merging_engine.query(query_prompt)
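On the ChatML question: one approach (a sketch, not verified against your llama_index version) is to give the LLM `messages_to_prompt` / `completion_to_prompt` callables so every prompt is rendered as ChatML before it reaches the completion endpoint. The two helper function names below are mine for illustration, not llama_index APIs.
Plain Text
from llama_index.llms import ChatMessage, OpenAILike

def messages_to_chatml(messages) -> str:
    # Render a list of ChatMessage objects as ChatML.
    text = ""
    for m in messages:
        text += f"<|im_start|>{m.role.value}\n{m.content}<|im_end|>\n"
    return text + "<|im_start|>assistant\n"

def completion_to_chatml(completion: str) -> str:
    # Wrap a bare completion prompt in a minimal ChatML conversation.
    return (
        "<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n"
        f"<|im_start|>user\n{completion}<|im_end|>\n"
        "<|im_start|>assistant\n"
    )

llm = OpenAILike(
    model="dpo-hermes",
    is_chat_model=False,  # keep using the /completions endpoint
    api_key="sk-dummy",
    api_base="http://dpo-hermes:8081/v1",
    # Assumed: the base LLM class accepts these two hooks.
    messages_to_prompt=messages_to_chatml,
    completion_to_prompt=completion_to_chatml,
)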
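On the original timing question, a minimal sketch (assuming the legacy llama_index.callbacks API): wrap the query call in wall-clock timing, and optionally attach a LlamaDebugHandler so retrieval and LLM events are timed individually. The get_event_time_info accessor is my assumption about that handler and may differ in your version.
Plain Text
import time

from llama_index.callbacks import CallbackManager, CBEventType, LlamaDebugHandler

# Attach a debug handler; rebuild the index / query engine with this
# service_context so its events flow through the handler.
llama_debug = LlamaDebugHandler(print_trace_on_end=True)
service_context = ServiceContext.from_defaults(
    llm=llm,
    embed_model=embed_model,
    callback_manager=CallbackManager([llama_debug]),
)

# Coarse end-to-end wall-clock timing around the whole RAG call.
start = time.perf_counter()
auto_merge_rag_response = auto_merging_engine.query(query_prompt)
print(f"end-to-end: {time.perf_counter() - start:.2f}s")

# Per-event timing, e.g. seconds spent in LLM calls.
# get_event_time_info is assumed from the legacy LlamaDebugHandler API.
print(llama_debug.get_event_time_info(CBEventType.LLM))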