I am exploring the SQL query engine. For the SQL one I get: "I'm sorry, but I can't execute SQL queries or access databases to provide real-time results. However, I can guide you on how to write the SQL query for the question you've asked." The pandas one works fine.
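For reference, this is the kind of setup I mean (a minimal sketch only — the SQLite URL, table name, and question are placeholders, and import paths can differ between LlamaIndex versions):

```python
from sqlalchemy import create_engine
from llama_index import SQLDatabase
from llama_index.indices.struct_store.sql_query import NLSQLTableQueryEngine

# Placeholder database and table -- my real schema is different.
engine = create_engine("sqlite:///example.db")
sql_database = SQLDatabase(engine, include_tables=["city_stats"])

# Natural-language-to-SQL query engine over the selected table.
query_engine = NLSQLTableQueryEngine(sql_database=sql_database, tables=["city_stats"])
response = query_engine.query("Which city has the highest population?")
print(response)
```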
I am creating an OpenAI assistant using (a) LlamaIndex and (b) the native OpenAI API. In both cases I upload the same grounding documents, set the same instruction prompt, and enable the retrieval tool. However, with (a) LlamaIndex the generated texts are quite bad, while variant (b), the native OpenAI API, works like a charm. What could be causing this? I was expecting similar results, since internally the same OpenAI APIs should be called in both cases.
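For (a), I'm creating the assistant roughly like this (a sketch only — the name, instructions, and file path are placeholders, and the import path may differ between LlamaIndex versions):

```python
from llama_index.agent import OpenAIAssistantAgent

# LlamaIndex wrapper around the OpenAI Assistants API: instructions,
# grounding files, and the retrieval tool are passed at creation time.
agent = OpenAIAssistantAgent.from_new(
    name="Grounded Assistant",                                # placeholder
    instructions="Answer only from the attached documents.",  # placeholder
    openai_tools=[{"type": "retrieval"}],
    files=["docs/grounding.pdf"],                             # placeholder
    verbose=True,
)
print(agent.chat("What does the document say about topic X?"))
```

Variant (b) uses the Assistants endpoints directly through the openai client, with the same instructions, files, and retrieval tool.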
How can I use AutoModelForCausalLM.from_pretrained('TheBloke/leo-hessianai-7B-chat-GGUF', model_file="leo-hessianai-7b-chat.Q4_K_M.gguf", model_type="llama") — i.e. the ctransformers-based loader — with LlamaIndex? It currently fails with "TheBloke/leo-hessianai-7B-chat-GGUF does not appear to have a file named pytorch_model.bin, tf_model.h5, model.ckpt or flax_model.msgpack.", because the AutoModelForCausalLM that LlamaIndex uses comes from the regular transformers library, not from ctransformers.
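To illustrate the mismatch (a sketch; the prompt is a placeholder, and the HuggingFaceLLM line is just my understanding of where the error comes from):

```python
# Standalone, this works: ctransformers ships its own AutoModelForCausalLM
# that reads GGUF files directly.
from ctransformers import AutoModelForCausalLM

llm = AutoModelForCausalLM.from_pretrained(
    "TheBloke/leo-hessianai-7B-chat-GGUF",
    model_file="leo-hessianai-7b-chat.Q4_K_M.gguf",
    model_type="llama",
)
print(llm("Hallo, wie geht es dir?"))  # placeholder prompt

# But LlamaIndex's HuggingFaceLLM imports AutoModelForCausalLM from the
# regular `transformers` package, which looks for pytorch_model.bin /
# safetensors weights and therefore fails on a GGUF-only repo:
from llama_index.llms import HuggingFaceLLM

hf_llm = HuggingFaceLLM(model_name="TheBloke/leo-hessianai-7B-chat-GGUF")  # raises the error above
```

Is there a supported way to plug a ctransformers-loaded model into LlamaIndex (e.g. via a CustomLLM wrapper), or should I load the GGUF through a different LlamaIndex LLM class such as LlamaCPP?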