
Updated last year

Hi All, I am trying to use text-to-SQL

At a glance

A community member is trying to use the text-to-SQL feature of LlamaIndex, but they cannot use the OpenAI API because their data and schema are confidential, and they will not have GPU access in the target deployment environment. They are running a local Llama-2 7B chat model via Hugging Face and ask whether everything can run on a CPU; setting device_map to "cpu" when instantiating HuggingFaceLLM did not work. Another community member asks for more details and points to a tutorial notebook on running a local Llama 2, but the original poster later reports that they figured it out.

Hi All, I am trying to use the text-to-SQL feature of LlamaIndex. However, I can't use the OpenAI API because of the confidentiality of my data/schema, and I won't have access to a GPU in the target deployment environment. I am running it with a local Llama-2 7B chat model via Hugging Face. Is it possible to run everything on a CPU?

I tried setting device_map to "cpu" when instantiating HuggingFaceLLM, but it doesn't work. Appreciate your help!
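For reference, here is a rough sketch of how a local Llama-2 7B chat model can be wired into LlamaIndex's text-to-SQL query engine while keeping everything on the CPU. It assumes a recent llama-index release with the Hugging Face integrations installed; the model id, embedding model, database URL, and table name are placeholders, and forcing torch_dtype to float32 is an assumption about why the float16 default misbehaves on CPU, not a confirmed diagnosis of the original poster's error.

```python
# Minimal CPU-only text-to-SQL sketch (assumptions noted in comments).
# pip install llama-index llama-index-llms-huggingface \
#     llama-index-embeddings-huggingface transformers accelerate torch sqlalchemy
import torch
from sqlalchemy import create_engine

from llama_index.core import Settings, SQLDatabase
from llama_index.core.query_engine import NLSQLTableQueryEngine
from llama_index.embeddings.huggingface import HuggingFaceEmbedding
from llama_index.llms.huggingface import HuggingFaceLLM

# Keep the weights on the CPU: device_map="cpu" avoids any GPU placement,
# and torch_dtype=float32 overrides the half-precision default, which is
# slow or unsupported on many CPUs (assumption).
llm = HuggingFaceLLM(
    model_name="meta-llama/Llama-2-7b-chat-hf",      # placeholder model id
    tokenizer_name="meta-llama/Llama-2-7b-chat-hf",
    context_window=3900,
    max_new_tokens=256,
    device_map="cpu",
    model_kwargs={"torch_dtype": torch.float32},
)

# Use a local embedding model so nothing is sent to OpenAI.
Settings.llm = llm
Settings.embed_model = HuggingFaceEmbedding(
    model_name="BAAI/bge-small-en-v1.5"  # placeholder embedding model
)

# Placeholder database and table; point these at your own schema.
engine = create_engine("sqlite:///example.db")
sql_database = SQLDatabase(engine, include_tables=["city_stats"])

query_engine = NLSQLTableQueryEngine(
    sql_database=sql_database,
    tables=["city_stats"],
    llm=llm,
)
print(query_engine.query("Which city has the highest population?"))
```

Expect generation with a 7B model on CPU to be slow; a quantized build (e.g. a GGUF model served through llama.cpp) is a common alternative when no GPU is available.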
2 comments
Could you share how you are doing this? (ideally with a stack trace)

This is a tutorial notebook on running local Llama 2: https://docs.llamaindex.ai/en/stable/examples/vector_stores/SimpleIndexDemoLlama-Local.html#local-llama2-vectorstoreindex
Nevermind! I figured it out. Thanks for offering to help!