Find answers from the community

Benjamin Bascary
Offline, last seen 3 months ago
Joined September 25, 2024
Hello everyone! Is there any callback handler or manager to access the internal thoughts of the agent? I can't find a solution... do the managers from LangChain work?
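LlamaIndex and LangChain both expose callback mechanisms for this kind of introspection; as a generic illustration of the pattern being asked about, here is a minimal, purely hypothetical sketch (ThoughtLogger, Agent, and on_step are made-up names, not a real library API):

```python
# Hypothetical sketch: a callback handler that records an agent's
# intermediate "thought" events so they can be inspected afterwards.
# Names here are illustrative, not a real LlamaIndex/LangChain API.

class ThoughtLogger:
    """Collects every intermediate step the agent reports."""

    def __init__(self):
        self.thoughts = []

    def on_step(self, step_type: str, payload: str) -> None:
        self.thoughts.append((step_type, payload))


class Agent:
    """Toy agent that emits its reasoning steps to an attached handler."""

    def __init__(self, handler: ThoughtLogger):
        self.handler = handler

    def run(self, query: str) -> str:
        self.handler.on_step("thought", f"User asked: {query!r}")
        self.handler.on_step("action", "lookup_tool")
        answer = f"Answer to {query!r}"
        self.handler.on_step("answer", answer)
        return answer


logger = ThoughtLogger()
agent = Agent(logger)
agent.run("What is RAG?")
for step_type, payload in logger.thoughts:
    print(step_type, "->", payload)
```

The key idea is that the handler is attached once and receives every intermediate event, so the "internal thoughts" are available after (or during) the run without changing the agent's return value.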
6 comments
Hey! I have a question about create-llama... When I execute the npx command and, in the final part, select the option to install dependencies, it fails because it can't find Poetry... but looking at the Poetry docs, they don't recommend installing Poetry globally... so this would also install the deps globally instead of in the create-llama project, am I right?

Should I select the option to get only the code, manually create a venv, install Poetry, and then manually install the dependencies?
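For reference, the manual route described above might look like the sketch below; the Poetry steps are shown as comments since they need network access, and whether you want Poetry inside the venv at all is a judgment call:

```shell
# Sketch of the manual route: take the "generate code only" option in
# create-llama, then create a project-local environment by hand.
python3 -m venv .venv            # virtualenv inside the project folder
. .venv/bin/activate             # activate it for this shell
# With the venv active, Poetry and the project dependencies stay local
# to the project rather than being installed globally:
#   pip install poetry
#   poetry config virtualenvs.in-project true
#   poetry install
```

Installing Poetry inside the project venv keeps everything project-local, which is what the Poetry docs' warning about global installs is getting at.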
2 comments
Hey guys! Does anyone know of alternatives to RagApp? I already know Verba exists, but it depends heavily on Weaviate; I also know LangChain's OpenGPTs is out there, but it hasn't received updates in a long time.
2 comments
Benjamin Bascary · Chat

Does anyone know how to change the prompt, or add context to the prompt, of a chat engine while keeping the chat history?

My code is as follows:

index = VectorStoreIndex.from_documents(
    documents, transformations=[text_splitter]
)
llm = OpenAI(model="gpt-3.5-turbo", max_tokens=200, system_prompt="You are a dog")

class RAG:
    def __init__(self, index: VectorStoreIndex):
        self.index = index.as_chat_engine(llm=llm, chat_mode="condense_plus_context")

    def chat(self, query: str):
        response = self.index.chat(query)
        return response.response
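As a generic illustration of the pattern being asked about (swap the system prompt or inject per-turn context while the history list is preserved), here is a hypothetical sketch; ContextChat and the stub LLM are made-up names, not the LlamaIndex chat-engine API:

```python
# Hypothetical sketch: keep chat history in one place, and prepend
# fresh system/context text to each turn without touching that history.

from typing import Callable, List, Tuple


class ContextChat:
    def __init__(self, llm: Callable[[str], str], system_prompt: str):
        self.llm = llm
        self.system_prompt = system_prompt          # can be swapped at any time
        self.history: List[Tuple[str, str]] = []    # (role, text) pairs

    def chat(self, query: str, extra_context: str = "") -> str:
        # Build the prompt from system text + extra context + full history.
        lines = [self.system_prompt, extra_context]
        lines += [f"{role}: {text}" for role, text in self.history]
        lines.append(f"user: {query}")
        reply = self.llm("\n".join(lines))
        # History is appended to, never rebuilt, so it survives any
        # change to the system prompt or the injected context.
        self.history.append(("user", query))
        self.history.append(("assistant", reply))
        return reply


stub_llm = lambda prompt: f"[saw {len(prompt)} chars]"
engine = ContextChat(stub_llm, system_prompt="You are a dog")
engine.chat("hello")
engine.system_prompt = "You are a cat"              # prompt changed...
engine.chat("still remember me?", extra_context="(user's name is Ben)")
print(len(engine.history))  # -> 4: history survived the prompt change
```

The design point: the prompt is rebuilt from parts on every turn, while the history is an append-only list held separately, so changing one never clobbers the other.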
2 comments
Does anyone know why instantiating the SAME Bedrock LLM agent in Gradio works great and is able to call the tools, but when using the agent with the FastAPI app provided by the create-llama command-line tool, it is not able to call the tools and just hallucinates? It only works with OpenAI; when switching to Claude (for example), it is not able to use the tools. The Bedrock agent is able to find the tools, but it does not receive the results when outputting the Observation; it's just hallucinations. It's weird, because in the Gradio app it just works.

create-llama:
@r.post("/request")
async def chat_request(
    data: _ChatData,
    chat_engine: BaseChatEngine = Depends(get_chat_engine),
):
    last_message_content, messages = await parse_chat_data(data)
    response = chat_engine.chat(last_message_content, messages)
    return _Result(
        result=_Message(role=MessageRole.ASSISTANT, content=response.response),
        nodes=_SourceNodes.from_source_nodes(response.source_nodes),
    )

Gradio app:

def run_agent(query: str) -> str:
    prompt = ...
    tool_names = [tool.metadata.name for tool in tools]
    tool_names_str = "\n".join(tool_names)
    qa_template = prompt.format(query=query, tools=tool_names_str)
    llm = Bedrock(
        model=os.getenv("MODEL_ID"),
        aws_access_key_id=os.getenv("BEDROCK_AWS_ACCESS_KEY"),
        aws_secret_access_key=os.getenv("BEDROCK_AWS_SECRET_KEY"),
        system_prompt=qa_template,
        temperature=0,
        region_name="us-east-1",
    )
    agent = ReActAgent.from_tools(tools, llm=llm, verbose=True, max_iterations=40)
    response = agent.chat(qa_template)
    return response

theme = gr.themes.Default(
    primary_hue="amber",
)
2 comments