PyMangekyo
Offline, last seen 3 months ago
Joined September 25, 2024
Hi all! I want to generate question and context pairs using "from llama_index.core.evaluation import generate_question_context_pairs". generate_question_context_pairs needs an LLM to generate the pairs. I am using llama13b deployed on my private premises, which returns output through an API call, and I want to use it as the LLM for generating those pairs. From the docs I could only see it used with OpenAI models, so how should I use llama13b instead? Following is the API wrapper that calls llama13b:

import json
import requests


class LLAMA2ApiWrapper:
    def __init__(self, api_url="XXXXXXXXXXXXXXXXXX"):
        self.api_url = api_url
        self.headers = {'Content-Type': 'application/json'}

    def make_api_request(self, input_text):
        # input_text = self.prompt.format(query=input_text)
        data = {
            "inputs": [
                {
                    "name": "prompt",
                    "shape": [1],
                    "datatype": "BYTES",
                    "data": [input_text]
                }
            ]
        }

        try:
            response = requests.post(url=self.api_url, data=json.dumps(data), headers=self.headers)
            response.raise_for_status()  # Raise an HTTPError for bad responses (4xx and 5xx)

            if response.json() is None:
                raise ValueError("Returned None String")

            # Keep only the text after the last "Question:" marker in the model output
            result = response.json()['outputs'][0]['data'][0].split("Question:")[-1][1:]
            return result

        except requests.exceptions.RequestException as e:
            print(f"Error making API request: {e}")
            return None

Some helping pointers will be appreciated!
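
One way to approach this, as a minimal sketch rather than an official LlamaIndex recipe: wrap the private endpoint in a llama_index.core.llms.CustomLLM subclass that implements complete() and stream_complete(), then pass an instance as the llm argument of generate_question_context_pairs. The class name LocalLlama13B and its api_url field are illustrative (the endpoint URL stays elided, as in the post above).

import json
from typing import Any

import requests
from llama_index.core.llms import (
    CustomLLM,
    CompletionResponse,
    CompletionResponseGen,
    LLMMetadata,
)
from llama_index.core.llms.callbacks import llm_completion_callback


class LocalLlama13B(CustomLLM):
    # Hypothetical wrapper around the private llama13b endpoint described above.
    api_url: str = "XXXXXXXXXXXXXXXXXX"  # endpoint left elided, as in the original post

    @property
    def metadata(self) -> LLMMetadata:
        return LLMMetadata(model_name="llama13b")

    @llm_completion_callback()
    def complete(self, prompt: str, **kwargs: Any) -> CompletionResponse:
        # Same request format as LLAMA2ApiWrapper.make_api_request above.
        data = {
            "inputs": [
                {"name": "prompt", "shape": [1], "datatype": "BYTES", "data": [prompt]}
            ]
        }
        response = requests.post(
            self.api_url, data=json.dumps(data), headers={"Content-Type": "application/json"}
        )
        response.raise_for_status()
        text = response.json()["outputs"][0]["data"][0]
        return CompletionResponse(text=text)

    @llm_completion_callback()
    def stream_complete(self, prompt: str, **kwargs: Any) -> CompletionResponseGen:
        # The endpoint does not stream, so yield the full completion once.
        yield self.complete(prompt, **kwargs)


# Pass the custom LLM instead of an OpenAI model:
# from llama_index.core.evaluation import generate_question_context_pairs
# qa_dataset = generate_question_context_pairs(
#     nodes, llm=LocalLlama13B(), num_questions_per_chunk=2
# )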
1 comment
from llama_index.embeddings import LangchainEmbedding

Not able to import LangchainEmbedding from llama_index.embeddings.
Using LlamaIndex 🦙 v0.10.11.post1. Why is this happening? The documentation says it is doable: https://docs.llamaindex.ai/en/stable/examples/embeddings/Langchain.html
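
In v0.10 the LangChain wrapper moved out of the top-level llama_index.embeddings namespace into a separate integration package, so the import path in the linked notebook no longer matches. A minimal sketch of the v0.10-style usage follows; the HuggingFaceEmbeddings model is just an illustrative choice and additionally requires langchain-community and sentence-transformers.

# pip install llama-index-embeddings-langchain langchain-community sentence-transformers
from langchain_community.embeddings import HuggingFaceEmbeddings
from llama_index.embeddings.langchain import LangchainEmbedding

# Wrap any LangChain embedding model so LlamaIndex can use it.
lc_embedding = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")
embed_model = LangchainEmbedding(lc_embedding)

vector = embed_model.get_text_embedding("hello world")
print(len(vector))  # embedding dimension, e.g. 384 for MiniLM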
12 comments
I am new to llama_index. I don't understand why it is so inconsistent with the documentation. Can anyone please explain? I felt LangChain is far more organized.
2 comments
What is the issue with llama_index? I am getting this error with the latest version, 0.10.11:

ImportError Traceback (most recent call last)
Cell In[2], line 2
1 # from llama_index.core.evaluation import generate_question_context_pairs
----> 2 from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, ServiceContext
3 from llama_index.core.node_parser import SimpleNodeParser
4 from llama_index.core.evaluation import generate_question_context_pairs

ImportError: cannot import name 'VectorStoreIndex' from 'llama_index.core' (unknown location)
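
One likely cause (an assumption; the trace alone doesn't prove it): leftover files from a pre-0.10 llama-index install shadowing the new v0.10 namespaced packages, which produces exactly this "unknown location" ImportError. A clean reinstall in a fresh environment usually resolves it; after that the v0.10 imports resolve as sketched below (ServiceContext is deprecated in 0.10 in favour of Settings, and the "data" directory is illustrative).

# In a fresh environment (or after `pip uninstall llama-index` and a reinstall),
# the v0.10 namespaced imports work as expected:
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, Settings
from llama_index.core.node_parser import SentenceSplitter
from llama_index.core.evaluation import generate_question_context_pairs

documents = SimpleDirectoryReader("data").load_data()  # "data" is a placeholder path
nodes = SentenceSplitter(chunk_size=512).get_nodes_from_documents(documents)
index = VectorStoreIndex(nodes)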
5 comments