The import is a bit incorrect.
Try it like this:
from llama_index.llms.huggingface import HuggingFaceLLM
Also, if you just upgraded from v0.9 to v0.10, I would recommend uninstalling llama-index first and then reinstalling it
thanks, i was following the notebooks so i was not sure of the imports
@WhiteFang_Jr - the import for the Hugging Face embedding model has also been changed, right? currently it is like this: from llama_index.embeddings.HuggingFaceEmbedding import HuggingFaceEmbedding
Yes, it has changed for the embedding as well. It'll be like this:
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
Okay let me take a look! Will reply in ~30 min, I'm away from my machine at present. I hope that works
thanks a lot! i am also trying from my side just to see if something is missing. Is there a page where all the changes are mentioned after the refactoring?
thanks, i think, if i am correct, not all the changes are reflected at this time.
Ah yes, this colab is not updated with the latest imports and installation!
I will try to update the colab imports in the morning and will raise a PR. Btw, from which docs did you find this colab link?
@WhiteFang_Jr this one, i was looking into running Mistral 7B Instruct from llama-index (run your own local models). Do you already know the changes in imports that have to be made?
Not all, but some like the BeautifulSoup one will require a PyPI installation:
pip install llama-index-readers-web
Then import will become:
from llama_index.readers.web import BeautifulSoupWebReader
Then
from llama_index.core import PromptTemplate
from llama_index.llms.huggingface import HuggingFaceLLM # Need to install this from pypi
from llama_index.core import ServiceContext
from llama_index.core import VectorStoreIndex
from llama_index.core import SummaryIndex
from llama_index.core.response.notebook_utils import display_response
from llama_index.core.tools import QueryEngineTool, ToolMetadata
from llama_index.core.query_engine import RouterQueryEngine
all of the remaining ones will also follow the same structure
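The pattern behind the renames above can be summed up in a small sketch (illustrative only, not an official mapping — the authoritative list is in the llama-index migration notes): core abstractions moved under llama_index.core, while integrations moved into their own pip packages with lowercase module names.

```python
# Sketch of the v0.9 -> v0.10 import renaming convention (illustrative only).
OLD_TO_NEW = {
    # core abstractions moved under llama_index.core
    "llama_index.PromptTemplate": "llama_index.core.PromptTemplate",
    "llama_index.VectorStoreIndex": "llama_index.core.VectorStoreIndex",
    "llama_index.SummaryIndex": "llama_index.core.SummaryIndex",
    # integrations moved to separately installed pip packages
    "llama_index.llms.HuggingFaceLLM": "llama_index.llms.huggingface.HuggingFaceLLM",
    "llama_index.embeddings.HuggingFaceEmbedding": "llama_index.embeddings.huggingface.HuggingFaceEmbedding",
    "llama_index.readers.BeautifulSoupWebReader": "llama_index.readers.web.BeautifulSoupWebReader",
}

def new_import(old_path: str) -> str:
    """Look up the v0.10 location of a v0.9 import path (falls back to the input)."""
    return OLD_TO_NEW.get(old_path, old_path)

print(new_import("llama_index.llms.HuggingFaceLLM"))
# llama_index.llms.huggingface.HuggingFaceLLM
```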
thanks, i am mostly trying to use the json reader so basically i just need some of the imports. thanks again
the installation is a simple !pip install right? nothing more to add?
Yes, just pip install the package name
thank you, a bit confused with the latest changes and not sure which module is working
I'll update this, could you share the exact doc link from where you got this colab link?
i took it from here and chose Open Source LLMs and Mistral 7B 4bit.
Also wondering if the jsonloader function has been deprecated?
it's saying no module named llama_index.llms.huggingface
Try restarting the session
@WhiteFang_Jr i don't think i made a typo
Hmm, this is strange! Just ran this:
Try deleting this runtime and trying one more time; also add the pip install llama-index-llms-huggingface
You can import prompts like this: from llama_index.core import PromptTemplate, BasePromptTemplate
but I don't see SimpleInputPrompt there
i was following a youtube video guide
@WhiteFang_Jr what should i change here?
TypeError Traceback (most recent call last)
<ipython-input-5-fb0e7db497d0> in <cell line: 6>()
4 """
5 ## Default format supportable by LLama2
----> 6 query_wrapper_prompt=BasePromptTemplate("<|USER|>{query_str}<|ASSISTANT|>")
TypeError: Can't instantiate abstract class BasePromptTemplate with abstract methods format, format_messages, get_template, partial_format
line 4 is just the system prompt
Try defining it with the PromptTemplate class; the class you chose is the abstract base used by all the prompt classes, so it can't be instantiated directly.
from llama_index.core import PromptTemplate
template = (
"We have provided context information below. \n"
"---------------------\n"
"{context_str}"
"\n---------------------\n"
"Given this information, please answer the question: {query_str}\n"
)
qa_template = PromptTemplate(template)
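The substitution a template like this performs is essentially Python's str.format with named placeholders — a simplified sketch of the idea, not llama-index's actual implementation:

```python
# Simplified sketch: named-placeholder substitution, as in str.format.
# Not llama-index's real code, just the underlying idea.
template = (
    "We have provided context information below. \n"
    "---------------------\n"
    "{context_str}"
    "\n---------------------\n"
    "Given this information, please answer the question: {query_str}\n"
)

# At query time the engine supplies values for each placeholder.
filled = template.format(
    context_str="LlamaIndex v0.10 split integrations into separate packages.",
    query_str="Where did HuggingFaceLLM move to?",
)
print(filled)
```

This is also why the abstract class can't be used directly: PromptTemplate is the concrete class that knows how to perform this formatting.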
@WhiteFang_Jr i have another error in my colab notebook, it seems like i need to change something before, but i have no idea where
could i dm you my notebook so you can check? ty
KeyError Traceback (most recent call last)
<ipython-input-28-1f3ea2f80ab9> in <cell line: 1>()
----> 1 response=query_engine.query("Hello")
10 frames
/usr/local/lib/python3.10/dist-packages/llama_index/core/prompts/base.py in format(***failed resolving arguments***)
194
195 mapped_all_kwargs = self._map_all_vars(all_kwargs)
--> 196 prompt = self.template.format(**mapped_all_kwargs)
197
198 if self.output_parser is not None:
KeyError: 'prompt'
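For anyone hitting this later: a KeyError from template.format(...) like the one above generally means the template string contains a {placeholder} (here {prompt}) that the engine never supplies a value for — either the placeholder needs renaming to the variable actually passed (like {query_str}), or a literal brace needs escaping as {{prompt}}. A plain-Python sketch of the failure and the fix (the template text here is hypothetical, not from this notebook):

```python
# The failure: str.format treats every {name} as a variable to fill,
# so an unexpected placeholder raises KeyError.
bad_template = "<|USER|>{prompt}<|ASSISTANT|>"  # hypothetical template text
try:
    bad_template.format(query_str="Hello")  # only query_str is supplied
except KeyError as e:
    print("KeyError:", e)

# The fix: use the placeholder name the engine actually passes.
good_template = "<|USER|>{query_str}<|ASSISTANT|>"
print(good_template.format(query_str="Hello"))  # <|USER|>Hello<|ASSISTANT|>
```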