Hi, how do you use HuggingFaceTextGenInference with query_wrapper_prompt?
The only way I can do it is to use the deprecated LLMPredictor:
Plain Text
from langchain.llms import HuggingFaceTextGenInference
from llama_index import LLMPredictor, PromptTemplate

llm = HuggingFaceTextGenInference(...)
query_wrapper_prompt = PromptTemplate("...")
llm_predictor = LLMPredictor(llm=llm, query_wrapper_prompt=query_wrapper_prompt)

Using it gives the warning "LLMPredictor is deprecated, please use LLM instead.", but it still works for simple cases.

But LLMPredictor can't seem to work with UnstructuredElementNodeParser:
Plain Text
from llama_index.node_parser import UnstructuredElementNodeParser

node_parser = UnstructuredElementNodeParser()
node_parser.llm = self.llm_predictor  # self.llm_predictor is the LLMPredictor from above

Doing the above gives the error ValueError: "LLMPredictor" object has no field "callback_manager"
Any advice please? Thanks
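
A minimal sketch of one likely fix, assuming UnstructuredElementNodeParser accepts a LlamaIndex LLM through its llm field (the "no field callback_manager" error suggests it expects an LLM object rather than an LLMPredictor); the inference server URL below is a placeholder:
Plain Text
from langchain.llms import HuggingFaceTextGenInference
from llama_index.llms import LangChainLLM
from llama_index.node_parser import UnstructuredElementNodeParser

# Wrap the LangChain LLM in a LlamaIndex LLM rather than the deprecated
# LLMPredictor, then hand it to the node parser directly.
llm = LangChainLLM(
    llm=HuggingFaceTextGenInference(
        inference_server_url="http://localhost:8080",  # placeholder endpoint
    )
)
node_parser = UnstructuredElementNodeParser(llm=llm)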
I think I replied to your github issue, but let me know if anything else comes up
@Logan M Thanks! I get an error when trying your suggestion on llama-index==0.9.15; it's crashing with:

Plain Text
TypeError: LangChainLLM.__init__() got an unexpected keyword argument 'query_wrapper_prompt'

In https://github.com/run-llama/llama_index/blob/main/llama_index/llms/langchain.py I don't see query_wrapper_prompt in the constructor.
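
For context, a reconstruction of the kind of call that triggers this (the exact code wasn't shown in the thread; the kwarg name comes from the error message, and the server URL is a placeholder):
Plain Text
from langchain.llms import HuggingFaceTextGenInference
from llama_index import PromptTemplate
from llama_index.llms import LangChainLLM

query_wrapper_prompt = PromptTemplate("...")

# Fails on llama-index 0.9.15: LangChainLLM.__init__ has no
# query_wrapper_prompt parameter.
llm = LangChainLLM(
    llm=HuggingFaceTextGenInference(
        inference_server_url="http://localhost:8080",  # placeholder endpoint
    ),
    query_wrapper_prompt=query_wrapper_prompt,
)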
Ah yea, oof, I missed that kwarg when refactoring. Technically, though, that kwarg is deprecated. I would do this instead (example for the llama2 format; you could swap in whatever processing your LLM needs):

Plain Text
def completion_to_prompt(prompt):
    # wrap the plain prompt in llama2's [INST] ... [/INST] instruction format
    return f"[INST] {prompt} [/INST] "

llm = LangChainLLM(..., completion_to_prompt=completion_to_prompt)
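
Putting the pieces together, a hedged end-to-end sketch of this suggestion (the server URL is a placeholder, and the ServiceContext line is just one common way to use the resulting LLM):
Plain Text
from langchain.llms import HuggingFaceTextGenInference
from llama_index.llms import LangChainLLM

def completion_to_prompt(prompt):
    # llama2 instruction format; swap in whatever your model expects
    return f"[INST] {prompt} [/INST] "

lc_llm = HuggingFaceTextGenInference(
    inference_server_url="http://localhost:8080",  # placeholder endpoint
)
llm = LangChainLLM(llm=lc_llm, completion_to_prompt=completion_to_prompt)

# llm can now be used anywhere llama-index expects an LLM, e.g.:
# service_context = ServiceContext.from_defaults(llm=llm)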