Find answers from the community

erik_squared
Offline, last seen 3 months ago
Joined September 25, 2024
I have a question about Llama-2 and generate_question_context_pairs.

When I use mistral-instruct, everything works well, but with Llama-2 the generated response is often prefaced with "Great! Here are two questions based on the provided context:", which ends up as a question in the qa_dataset.

Has anyone else bumped into / worked around this? I am using LlamaCPP to instantiate the llm object, with the default messages_to_prompt / completion_to_prompt.
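One workaround, assuming the chatty preamble always precedes the actual questions, is to post-filter the raw generation and keep only lines that read as questions before they land in the qa_dataset. The helper below is a hypothetical sketch, not part of llama-index:

```python
def strip_preamble(raw: str) -> list[str]:
    """Keep only lines that look like questions, dropping chatty
    preambles such as 'Great! Here are two questions ...'."""
    questions = []
    for line in raw.splitlines():
        line = line.strip()
        # Drop leading enumeration like "1." or "2)" if present.
        line = line.lstrip("0123456789.) ").strip()
        if line.endswith("?"):
            questions.append(line)
    return questions


raw = (
    "Great! Here are two questions based on the provided context:\n"
    "1. What does the context say about X?\n"
    "2. How is Y computed?"
)
print(strip_preamble(raw))
# → ['What does the context say about X?', 'How is Y computed?']
```

Alternatively, a stricter system prompt via a custom completion_to_prompt sometimes suppresses the preamble, but Llama-2-chat is known to be chatty, so a post-filter is the more reliable belt-and-braces option.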
2 comments
A question about the Langfuse integration: if I wanted to retrieve the trace_id so I can later add a user-inputted score to a trace, one example I've seen sets a root trace before invoking llama-index:
Plain Text
from langfuse import Langfuse
from langfuse.llama_index import LlamaIndexCallbackHandler
from llama_index.core import Settings
from llama_index.core.callbacks import CallbackManager

langfuse = Langfuse()
langfuse_callback_handler = LlamaIndexCallbackHandler()
Settings.callback_manager = CallbackManager([langfuse_callback_handler])

def my_func():
    # Create a new trace on your main execution path
    root_trace = langfuse.trace(name="trace-name")

    # Set the root trace; subsequent LlamaIndex observations will be nested under it
    langfuse_callback_handler.set_root(root_trace)

    # Your LlamaIndex code here

    # Reset the root; subsequent LlamaIndex observations will use the default
    # grouping and trace creation again
    langfuse_callback_handler.set_root(None)

Is this thread-safe? Or would we need to create and pass a new callback handler/manager for each invocation?

Or is there a simpler way to retrieve the trace_id so that the trace can be updated manually via the Langfuse SDK later?
2 comments
A question about the CallbackManager → instrumentation migration: I'm working with RetrieverQueryEngine and CondensePlusContextChatEngine with a custom retriever, using Langfuse for observability. I get different trace spans between the two, and looking at the code for each, they seem to use different tracing constructs.

I'm trying to figure out which one is more "current" with respect to the migration to the instrumentation package.
1 comment