Find answers from the community

Naveen
This is how I'm calling the query_engine:

Plain Text
import time

#################################################################
# Defining a prompt
#################################################################
query = "Summarize the following document."
print("Querying...")
start_time = time.time()
response_stream = query_engine.query(query)
print(".")
print(".")
print(".")
response_stream.print_response_stream()
print(".")
print(".")
print(".")
elapsed_time = round((time.time() - start_time), 2)
print(f"Time taken for answer to generate: {elapsed_time} seconds.")


I get an error saying context_str isn't found. How do I point the query engine at the index data?
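The usual LlamaIndex pattern is to build an index from your documents and create the query engine from it; a sketch, assuming the standard `VectorStoreIndex` flow (the "data" directory and variable names are illustrative):

```python
# Sketch, assuming llama-index is installed and documents live in ./data
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader

documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# streaming=True is required for print_response_stream() to work
query_engine = index.as_query_engine(streaming=True)
```

If a custom prompt template is being passed to the query engine, it must keep the `{context_str}` and `{query_str}` placeholders; a template missing `{context_str}` is a common cause of that error.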
2 comments
Hey everyone. So I'm using ollama along with llamaindex. I followed the tutorial and docs and everything works fine until I try to edit the parameters like max_new_tokens. This is the code I'm using:

Plain Text
llm = Ollama(
    base_url="https://localhost:11434",
    model="mistral:instruct",
    temperature=0.1,
    additional_kwargs={"max_new_tokens": 512},
    context_window=3900,
)


I get the following error:

Plain Text
httpx.ConnectError: [SSL] record layer failure (_ssl.c:1007)


So this wasn't my first pass at trying to change the model's parameters. Originally I ran this code without specifying the base_url part. This threw the following error at me:

Plain Text
httpx.HTTPStatusError: Client error '400 Bad Request' for url 'http://localhost:11434/api/chat'
For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/400


So I specified the base_url with https instead of http. This only comes up as an issue if I specify the additional_kwargs argument. Does anyone know how I can fix this?
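A sketch of a possible fix, with two hedged assumptions: Ollama serves plain HTTP on port 11434 (so an https:// URL produces the SSL record-layer failure), and Ollama's option for capping generated tokens is named num_predict rather than max_new_tokens, which may be why the server rejected the request with a 400:

```python
# Sketch, assuming llama-index-llms-ollama is installed and an Ollama
# server is running locally on the default port.
from llama_index.llms.ollama import Ollama

llm = Ollama(
    # Ollama's local API is plain HTTP, not HTTPS
    base_url="http://localhost:11434",
    model="mistral:instruct",
    temperature=0.1,
    # "num_predict" is Ollama's name for the max-new-tokens limit
    additional_kwargs={"num_predict": 512},
    context_window=3900,
)
```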
12 comments
Hey guys. Probably a dumb question, but is groq open source?
4 comments