anyone tried accessing llama_index.llms

Has anyone tried "from llama_index.llms import Groq"? I am not able to access this LLM with:
llama-index-llms-groq 0.1.3
llama-index-core 0.10.14
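For context, in llama-index >= 0.10 the Groq integration ships as a separate namespace package, so the class is imported from llama_index.llms.groq rather than from llama_index.llms directly. A minimal sketch, assuming llama-index-llms-groq is installed; the model name and API key below are placeholders, not values from this thread:

```python
# Sketch: the Groq LLM lives in the llama-index-llms-groq namespace
# package, so the full module path is llama_index.llms.groq.
# Assumes `pip install llama-index-llms-groq` has been run.

def try_import_groq():
    """Return the Groq class if the integration is importable, else None."""
    try:
        from llama_index.llms.groq import Groq  # note: not llama_index.llms
        return Groq
    except ImportError:
        return None

Groq = try_import_groq()
if Groq is not None:
    # Placeholder model name and key, for illustration only.
    llm = Groq(model="mixtral-8x7b-32768", api_key="YOUR_GROQ_API_KEY")
else:
    print("llama-index-llms-groq is not importable in this environment")
```

If the import fails even with the package installed, that usually points at a stale environment, which is where the thread ends up going.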
What do you mean by "unable to access"?
I just installed all the requirements, but I am still not seeing the Groq object in llms.
(base) sridharkannan@Sridhars-Air inference % pip list | grep llama
llama-hub 0.0.60
llama-index-agent-openai 0.1.5
llama-index-cli 0.1.6
llama-index-core 0.10.14
llama-index-embeddings-openai 0.1.6
llama-index-indices-managed-llama-cloud 0.1.3
llama-index-legacy 0.9.48
llama-index-llms-groq 0.1.3
llama-index-llms-openai 0.1.6
llama-index-llms-openai-like 0.1.3
llama-index-multi-modal-llms-openai 0.1.4
llama-index-program-openai 0.1.4
llama-index-question-gen-openai 0.1.3
llama-index-readers-file 0.1.6
llama-index-readers-llama-parse 0.1.3
llama-index-vector-stores-chroma 0.1.5
llama-parse 0.3.5
llamaindex-py-client 0.1.13
This is the set of packages I have.
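The `pip list | grep llama` check above can also be done from Python with only the standard library, which is handy for confirming which environment your interpreter is actually using. A sketch using importlib.metadata (Python >= 3.8); the package names are taken from the listing above:

```python
# Sketch: programmatic version of `pip list | grep llama`,
# using only the standard library.
from importlib.metadata import version, PackageNotFoundError

def installed_version(package):
    """Return the installed version string, or None if not installed."""
    try:
        return version(package)
    except PackageNotFoundError:
        return None

# Package names from the pip listing above.
for pkg in ("llama-index-core", "llama-index-llms-groq"):
    print(pkg, "->", installed_version(pkg))
```

If this prints None for llama-index-llms-groq while `pip list` shows it, pip and your interpreter are pointing at different environments, which matches the "fresh venv" advice below.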
"but still i am not seeing the Groq object in llms" -- are you actually running code, or relying on IntelliSense?

Works fine on google colab
https://colab.research.google.com/drive/1sQhOI7TN6CUfHp90Uvs8YmHmznpLp0qn?usp=sharing

Maybe start with a fresh venv
I just uninstalled all these packages and installed them again. Now I am able to access it.
Is it mandatory to use v0.10.14? Was it a recent update?
For Groq, it is only in the latest versions.
Any snapshot of how to move quickly to the new version from v0.9.x? Somehow it didn't work the other day, so I want to do it systematically (probably one thing I missed was the venv).
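The route that worked above (fresh venv, then reinstall) can be sketched as a few shell commands. This is a sketch, not official migration docs; the package names are the ones from this thread, and the final line just smoke-tests the import that started the discussion:

```shell
# Sketch: fresh virtual environment for llama-index 0.10.x,
# instead of upgrading in place over a 0.9.x install.
python3 -m venv fresh-venv
source fresh-venv/bin/activate

pip install --upgrade pip
pip install llama-index-core llama-index-llms-groq

# Smoke-test the import from the thread.
python -c "from llama_index.llms.groq import Groq; print('Groq import OK')"
```

Starting clean avoids leftover 0.9.x modules shadowing the new namespace packages, which is a common cause of the "object not found" symptom above.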
I'm going right at it ... for semantic chunks and Groq ... both vital 👍🏼
Deeply appreciate LlamaIndex keeping pace with the new stuff.