Hi all! I want to generate question and context pairs using "from llama_index.core.evaluation import generate_question_context_pairs". generate_question_context_pairs needs an LLM to produce the pairs. I am using llama13b deployed on my private premises, which returns output through an API call, and I want to use it as my LLM to generate those pairs. I can see from the docs that it takes OpenAI models, so how should I use llama13b instead? The following is the API wrapper that calls llama13b:

import json
import requests


class LLAMA2ApiWrapper:
    def __init__(self, api_url="XXXXXXXXXXXXXXXXXX"):
        self.api_url = api_url
        self.headers = {"Content-Type": "application/json"}

    def make_api_request(self, input_text):
        # input_text = self.prompt.format(query=input_text)
        # Build the request payload in the inference server's expected format
        data = {
            "inputs": [
                {
                    "name": "prompt",
                    "shape": [1],
                    "datatype": "BYTES",
                    "data": [input_text],
                }
            ]
        }

        try:
            response = requests.post(url=self.api_url, data=json.dumps(data), headers=self.headers)
            response.raise_for_status()  # Raise an HTTPError for bad responses (4xx and 5xx)

            if response.json() is None:
                raise ValueError("Returned None String")

            # Keep only the text after the last "Question:" marker in the first output
            result = response.json()["outputs"][0]["data"][0].split("Question:")[-1][1:]
            return result

        except requests.exceptions.RequestException as e:
            print(f"Error making API request: {e}")
            return None
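
For reference, here is how I call the wrapper (the prompt string is just an illustrative placeholder):

wrapper = LLAMA2ApiWrapper()
# Returns the generated text, or None if the request failed
answer = wrapper.make_api_request("Generate one question about this passage: ...")
print(answer)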

Any helpful pointers would be appreciated!
1 comment
You can try OpenAILike and pass the API URL; it should work then.
Check this doc: https://docs.llamaindex.ai/en/stable/examples/llm/localai.html#localai
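A minimal sketch of that suggestion (the model name, endpoint URL, and the nodes variable are illustrative assumptions, not values from this thread). Note that OpenAILike expects the server to expose an OpenAI-compatible endpoint, unlike the raw inputs/outputs payload in the question:

from llama_index.core.evaluation import generate_question_context_pairs
from llama_index.llms.openai_like import OpenAILike  # pip install llama-index-llms-openai-like

# Hypothetical values: point api_base at your OpenAI-compatible llama13b endpoint
llm = OpenAILike(
    model="llama-2-13b",                  # assumed model name
    api_base="http://localhost:8000/v1",  # assumed endpoint URL
    api_key="fake",                       # many self-hosted servers ignore the key
    is_chat_model=True,
)

# nodes: the list of TextNode objects you want question/context pairs for
qa_dataset = generate_question_context_pairs(
    nodes,
    llm=llm,
    num_questions_per_chunk=2,
)

If the server only speaks the inputs/outputs protocol shown above, it would need either an OpenAI-compatible frontend or a custom LLM subclass instead.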