Hi! Is it possible to run llama_index with only local components on a machine without internet access? I get this error on the very first import. llama_index==0.7.20
I'm hitting it too. The error appears even before I create any classes from llama_index; it happens right on `import llama_index`, because something tries to reach the OpenAI API from the `__init__.py` file.
It's because tiktoken (the tokenizer used under the hood to count tokens, etc.) is trying to download its encoding files on first use. You might have to look up how to pre-download and cache this tokenizer.
Yea, it's pretty baked into the codebase for counting tokens during chunking and other parts of llama-index. Kind of annoying, but it's tech debt right now.