Can you use Llama 2 locally without Replicate or OpenAI?

Please stand by @Steve_h_112 as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
Yes, you can use LlamaIndex locally without relying on OpenAI or Replicate. By default, LlamaIndex sends your data to OpenAI to generate embeddings and natural language responses, but this is configurable: you can plug in your own embedding model and run a large language model (such as Llama 2) locally if desired.

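A minimal sketch of that setup, assuming the legacy ServiceContext API and the llama-cpp-python backend installed (the model path below is a placeholder for your own downloaded weights):

Plain Text
from llama_index import ServiceContext
from llama_index.llms import LlamaCPP

# Placeholder path -- point this at your own downloaded Llama 2 weights
llm = LlamaCPP(model_path="./models/llama-2-13b-chat.gguf")

# embed_model="local" uses a local HuggingFace embedding model instead of OpenAI
service_context = ServiceContext.from_defaults(llm=llm, embed_model="local")
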
For example, in the context of PII Masking, it is recommended to use a local LLM. Here is an example of how you can set up a local service context:

Plain Text
from llama_index import ServiceContext
from llama_index.postprocessor import PIINodePostprocessor

service_context = ServiceContext.from_defaults(llm=llm)  # llm from the local setup above
processor = PIINodePostprocessor(service_context=service_context)

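As a quick usage sketch (the node text below is invented for illustration), you can feed the processor a wrapped node and inspect the masked result:

Plain Text
from llama_index.schema import NodeWithScore, TextNode

# Sample text containing PII to be masked
node = TextNode(text="Hi, I'm Jane Doe and my phone number is 555-0100.")
masked = processor.postprocess_nodes([NodeWithScore(node=node)])
print(masked[0].node.get_content())  # PII replaced with placeholder labels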

You can find more examples for local LLMs in the LlamaIndex documentation.

Please note that the privacy and handling of your data are subject to the policies of the service you choose to use. Each custom service other than OpenAI has its own policies as well.

Source
----------------------
Note: you can ask me a follow up question by @ mentioning me again :speech_balloon:
----------------------