
Updated 2 months ago


Hi! I'm trying to understand the difference in output I'm seeing between using OpenAI and a model hosted on HuggingFace (where I create an LLM for the service context using HuggingFaceLLM) with the NLSQLTableQueryEngine. If I use OpenAI directly, I get a whole preamble:
"Given an input question, first create a syntactically correct sqlite query to run, then look at the results of the query and return the answer" — plus a lot more prompt prepping OpenAI for a response.
But if I use a model wrapped in the HuggingFaceLLM class, I only get the SQL tables dumped out to the LLM.
Overall performance of non-OpenAI models has been poor in comparison (which might be a "no duh" comment, even using the 70B Llama 2) when using NLSQLTableQueryEngine, but I'm just trying to make sure I've explored this path as much as I can before falling back to OpenAI.
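For context, the text-to-SQL flow that both setups share can be sketched in plain Python. This is a toy illustration with a stubbed LLM — `fake_llm`, `TEXT_TO_SQL_TMPL`, and the schema string are hypothetical stand-ins modeled on the preamble quoted above, not LlamaIndex's actual internals:

```python
import sqlite3

# Illustrative template modeled on the quoted preamble; the real
# NLSQLTableQueryEngine prompt template lives inside llama-index.
TEXT_TO_SQL_TMPL = (
    "Given an input question, first create a syntactically correct sqlite "
    "query to run, then look at the results of the query and return the answer.\n"
    "Schema:\n{schema}\n"
    "Question: {question}\nSQLQuery:"
)

def fake_llm(prompt: str) -> str:
    # Stand-in for any LLM (OpenAI or HuggingFaceLLM): both should
    # receive the same fully formatted prompt.
    return "SELECT COUNT(*) FROM users;"

def text_to_sql_query(conn, schema: str, question: str):
    # Format schema + question into the prompt, ask the LLM for SQL,
    # then execute that SQL against the database.
    prompt = TEXT_TO_SQL_TMPL.format(schema=schema, question=question)
    sql = fake_llm(prompt)
    rows = conn.execute(sql).fetchall()
    return prompt, sql, rows

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO users (name) VALUES (?)", [("a",), ("b",)])
prompt, sql, rows = text_to_sql_query(
    conn, "users(id, name)", "How many users are there?"
)
```

The point of the sketch: the prompt is assembled before the LLM is chosen, so a different preamble per model would be surprising.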
5 comments
Both models should be given the same preamble 🤔 The prompts are not dependent on the model

Curious what you used to check this?
I enabled logging as such:
Plain Text
import logging
import sys

logging.basicConfig(stream=sys.stdout, level=logging.DEBUG)
logging.getLogger().addHandler(logging.StreamHandler(stream=sys.stdout))

Captured the outputs for both OpenAI and when using the HuggingFaceLLM class
I thiiiiink you might be comparing two separate logging events for openai vs. huggingface

There is no console log for what is sent to the huggingface LLM (at least from the code that I'm looking at)

I would suggest using the token counting handler to see exactly what is sent to each LLM

https://gpt-index.readthedocs.io/en/stable/examples/callbacks/TokenCountingHandler.html#token-counting-handler
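The idea behind the token counting handler can be sketched in plain Python (hypothetical names throughout — `PromptRecorder` and `on_llm_call` are illustrative, not the real `TokenCountingHandler` API linked above): record exactly what was sent to and received from the LLM, independent of what happens to show up in console logs.

```python
from dataclasses import dataclass, field

@dataclass
class LLMEvent:
    # One recorded LLM round trip: the exact prompt and completion,
    # plus token counts for each.
    prompt: str
    completion: str
    prompt_token_count: int
    completion_token_count: int

@dataclass
class PromptRecorder:
    # Hypothetical stand-in for a token counting callback handler:
    # capture every prompt/completion pair so the two backends can be
    # compared directly rather than via logging output.
    tokenizer: callable
    events: list = field(default_factory=list)

    def on_llm_call(self, prompt: str, completion: str) -> None:
        self.events.append(LLMEvent(
            prompt=prompt,
            completion=completion,
            prompt_token_count=len(self.tokenizer(prompt)),
            completion_token_count=len(self.tokenizer(completion)),
        ))

# Usage: whitespace split as a toy tokenizer.
recorder = PromptRecorder(tokenizer=str.split)
recorder.on_llm_call("Given an input question, write SQL.", "SELECT 1;")
first = recorder.events[0]
```

With the real handler attached to both the OpenAI and HuggingFaceLLM setups, inspecting the recorded prompts shows whether they actually differ.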
I will say this: I only really notice the difference in the user-role content of the request to OpenAI. The rest of the output is the same, so it might just be that I'm seeing more in the OpenAI logs because the full request to OpenAI is being logged.
Ok so I think we're saying the same thing. I'll confirm via counting handler though. Thank you for the sanity check.