
Updated 9 months ago

Import

Is this import working? from llama_index.llms import HuggingFaceLLM
I tried to install !pip install llama-index-llms-huggingface
but it still shows an error. I think some libraries were migrated recently.
The import is a bit incorrect.

Try like this:
from llama_index.llms.huggingface import HuggingFaceLLM
Also, if you just upgraded from v0.9 to v0.10, I would recommend uninstalling llama-index first and then reinstalling it.
Thanks, I was following the notebooks so I was not sure of the imports.
@WhiteFang_Jr - the import for the HuggingFace embedding model has also changed, right? Currently it is like this: from llama_index.embeddings.HuggingFaceEmbedding import HuggingFaceEmbedding
Yes, it changed for embeddings too. It'll be like this:
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
Hi @WhiteFang_Jr, it seems the imports are not working properly. I fail to import the settings or prompt template, etc. I am following this notebook (shared by LlamaIndex: https://colab.research.google.com/drive/1ZAdrabTJmZ_etDp10rjij_zME2Q3umAQ?usp=sharing#scrollTo=iiS0z1UxWgyt). Not sure what is wrong; I just followed the steps mentioned.
Okay, let me take a look! I'll reply in about 30 minutes, I'm away from my machine at present. I hope that works πŸ™
Thanks a lot! I am also trying from my side to see if something is missing. Is there a page where all the changes after the refactoring are listed?
Not all the changes, but if you go through this: https://discord.com/channels/1059199217496772688/1073670729054294197/1207845501660168232

you'll get an idea of how the code has been divided and merged.
Thanks. I think, if I am correct, not all the changes are reflected yet.
Ah yes, this colab is not updated with the latest imports and installation!
I will try to update the colab imports in the morning and raise a PR. By the way, from which docs did you find this colab link?
@WhiteFang_Jr this one; I was looking into running Mistral 7B Instruct from LlamaIndex (run your own local models). Do you already know the changes in imports that have to be made?
Not all, but some, like the BeautifulSoup one, will require a PyPI installation: pip install llama-index-readers-web

Then the import becomes: from llama_index.readers.web import BeautifulSoupWebReader

Then
Plain Text
from llama_index.core import PromptTemplate
from llama_index.llms.huggingface import HuggingFaceLLM   # Need to install this from pypi

from llama_index.core import ServiceContext
from llama_index.core import VectorStoreIndex
from llama_index.core import SummaryIndex
from llama_index.core.response.notebook_utils import display_response
from llama_index.core.tools import QueryEngineTool, ToolMetadata
from llama_index.core.query_engine import RouterQueryEngine
These are a few of them; all of the ones below follow the same structure.
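Since each of those imports now lives in its own pip package, one way to check which subpackages are actually installed in the runtime, without triggering an ImportError mid-notebook, is a small stdlib-only sketch like this (the module list matches the post-v0.10 names mentioned above):

```python
import importlib.util

# Post-v0.10, each integration is a separate pip package that installs
# a submodule under the shared llama_index namespace.
def is_installed(name: str) -> bool:
    """Return True if the (sub)module can be found without fully importing it."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # Raised when a parent package (e.g. llama_index itself) is missing.
        return False

for name in [
    "llama_index.core",                    # pip install llama-index-core
    "llama_index.llms.huggingface",        # pip install llama-index-llms-huggingface
    "llama_index.embeddings.huggingface",  # pip install llama-index-embeddings-huggingface
    "llama_index.readers.web",             # pip install llama-index-readers-web
]:
    status = "installed" if is_installed(name) else "missing - needs pip install"
    print(f"{name}: {status}")
```

Running this in a fresh Colab runtime shows exactly which of the new packages still need a pip install before the imports will work.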
Thanks, I am mostly trying to use the JSON reader, so basically I just need some of the imports. Thanks again.
The installation is a simple !pip install, right? Nothing more to add?
Yes, just pip install the package name.
Thank you. I'm a bit confused by the latest changes and not sure which modules are working.
I'll update this. Could you share the exact doc link from where you got this colab link?
I took it from here and chose the Open Source LLMs and Mistral 7B 4-bit.
Also wondering if the JSONLoader function has been deprecated?
It's saying no module named llama_index.llms.huggingface
previous cells
[Attachment: image.png]
Try restarting the session
@WhiteFang_Jr I don't think I made a typo.
Hmm, this is strange! Just ran this:
[Attachment: image.png]
Try deleting this runtime and trying one more time; also add the pip install llama-index-llms-huggingface
How about this one?
[Attachment: image.png]
You can import prompts like this: from llama_index.core import PromptTemplate, BasePromptTemplate

but I don't see SimpleInputPrompt there.
I was following a YouTube video guide.
@WhiteFang_Jr what should I change here?
Plain Text
TypeError                                 Traceback (most recent call last)
<ipython-input-5-fb0e7db497d0> in <cell line: 6>()
      4 """
      5 ## Default format supportable by LLama2
----> 6 query_wrapper_prompt=BasePromptTemplate("<|USER|>{query_str}<|ASSISTANT|>")

TypeError: Can't instantiate abstract class BasePromptTemplate with abstract methods format, format_messages, get_template, partial_format
Line 4 is just the system prompt.
Try defining it with the PromptTemplate class; the class you chose is the abstract base used by all the prompt classes, so it can't be instantiated directly.

Plain Text
from llama_index.core import PromptTemplate

template = (
    "We have provided context information below. \n"
    "---------------------\n"
    "{context_str}"
    "\n---------------------\n"
    "Given this information, please answer the question: {query_str}\n"
)
qa_template = PromptTemplate(template)
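As a quick sanity check, the placeholders in that template are plain Python str.format fields, so you can preview the final prompt text without LlamaIndex at all. A stdlib-only sketch (the context and question values here are made up for illustration):

```python
# Same template string as above; PromptTemplate fills these fields
# with str.format-style substitution.
template = (
    "We have provided context information below. \n"
    "---------------------\n"
    "{context_str}"
    "\n---------------------\n"
    "Given this information, please answer the question: {query_str}\n"
)

# Preview what the LLM will actually receive (hypothetical values).
prompt = template.format(
    context_str="LlamaIndex v0.10 split integrations into separate pip packages.",
    query_str="Where does HuggingFaceLLM live now?",
)
print(prompt)
```

This makes it easy to confirm every placeholder in the template matches a variable you are actually supplying.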
@WhiteFang_Jr I have another error in my colab notebook. It seems I need to change something earlier, but I have no idea where.
Could I DM you my notebook so you can check? Thank you.
Plain Text
KeyError                                  Traceback (most recent call last)
<ipython-input-28-1f3ea2f80ab9> in <cell line: 1>()
----> 1 response=query_engine.query("Hello")

10 frames
/usr/local/lib/python3.10/dist-packages/llama_index/core/prompts/base.py in format(***failed resolving arguments***)
    194 
    195         mapped_all_kwargs = self._map_all_vars(all_kwargs)
--> 196         prompt = self.template.format(**mapped_all_kwargs)
    197 
    198         if self.output_parser is not None:

KeyError: 'prompt'
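The traceback shows the failure happening inside self.template.format(**mapped_all_kwargs), which suggests the template string contains a {prompt} placeholder that the query engine never supplies (it passes variables like query_str instead). A stdlib-only sketch of the failure and two possible fixes (the template strings here are made-up examples, not from the notebook):

```python
# Prompt templates are filled with str.format, so every {name} in the
# template must match a variable the engine actually passes in.
template = "<|USER|>{prompt}<|ASSISTANT|>"  # hypothetical template with a stray {prompt}

try:
    template.format(query_str="Hello")  # the engine supplies query_str, not prompt
except KeyError as e:
    print("KeyError:", e)  # reproduces: KeyError: 'prompt'

# Fix 1: rename the placeholder to the variable the engine fills.
fixed = "<|USER|>{query_str}<|ASSISTANT|>"
print(fixed.format(query_str="Hello"))  # <|USER|>Hello<|ASSISTANT|>

# Fix 2: if the braces are meant literally, double them so format ignores them.
literal = "{{prompt}}".format()
print(literal)  # {prompt}
```

So the fix is likely to search the notebook for a template containing {prompt} and either rename it to {query_str} or escape the braces.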