Updated 2 years ago

Anyone know if it would be hard to convert my knowledge based chatbot that uses OpenAi to use llama2 instead (while still using llamaindex of course)? I would like to see if it works better than open AI for my app
llama2 is almost certainly worse than OpenAI πŸ˜…

But we do support llama2 a few ways. Would you be running it locally? Using some other API?
Locally. Currently I just have my OpenAI API key set and use this code:

index = create_index(documents)
chat_engine = index.as_chat_engine(chat_mode="react", verbose=False)

# Send query to bot
response = chat_engine.chat("Use the tool to answer " + user_input)
create_index just uses GPTVectorStoreIndex
I would like to change the bot to llama2 to try it out if I can
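For context, a helper like the create_index mentioned above might look something like this. This is a hypothetical reconstruction from the description in the thread, assuming the legacy (pre-0.10) llama_index package layout; the function name and data directory are taken from the conversation, not from real code:

```python
# Hypothetical sketch of create_index, reconstructed from the thread.
# Assumes the legacy llama_index API where GPTVectorStoreIndex is a
# top-level export; documents come from SimpleDirectoryReader.
from llama_index import GPTVectorStoreIndex, SimpleDirectoryReader


def create_index(documents):
    # documents: a list of llama_index Document objects, e.g. produced by
    # SimpleDirectoryReader("./data").load_data() over the PDF-derived text files
    return GPTVectorStoreIndex.from_documents(documents)
```

With no LLM configured explicitly, this index uses the OpenAI defaults for both the LLM and the embeddings, which is why swapping in llama2 requires changing the service configuration rather than just the API key.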
What kinds of resources do you have locally?
PDFs that are converted to a text document. llamaindex then creates an index and runs the chatbot. It's all in a Python Flask application
I mean, if you want to run llama2 locally, do you have a large GPU?
Not necessarily. Does it take a lot more processing power than OpenAI does? I have the Flask app running on a server, working well with OpenAI, and I test it locally. I thought I would be able to use an API key with llama2, but I guess that's not the case
Yea, you'll need pretty powerful hardware to run llama2 locally πŸ˜…

Another option is getting an API key from Replicate and using their API:

https://gpt-index.readthedocs.io/en/stable/examples/vector_stores/SimpleIndexDemoLlama2.html
Great, thank you for your help!