

Troubleshooting Llama Index Core Attribute Error

if llama_index.core.global_tokenizer is None:
    tiktoken_import_err = (
        "tiktoken package not found, please run pip install tiktoken"
    )

AttributeError: module 'llama_index' has no attribute 'core'

Does anybody know about this? I have done the installation, but it still says no attribute 'core' when creating a MultiModalVectorStoreIndex.
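As a quick diagnostic (a minimal, standard-library sketch, not specific to any one environment), you can check whether the `llama_index.core` subpackage is even discoverable before importing it; an AttributeError like the one above often means the package is broken or shadowed by a conflicting install:

```python
import importlib.util

def has_module(name: str) -> bool:
    """Return True if the module can be located without importing it."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # Raised when a parent package in the dotted path is itself missing.
        return False

# Check both the top-level package and the core subpackage.
for mod in ("llama_index", "llama_index.core"):
    print(mod, "found" if has_module(mod) else "MISSING")
```

If `llama_index` is found but `llama_index.core` is missing, the environment likely has a stale or conflicting install of the llama-index packages.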
7 comments
Feel free to pass along the Colab that's causing the issue.
I have refreshed the installation, but the issue persists.
Well, you have a different error now:

cannot import name 'ImageBlock' from 'llama_index.core.llms'
I removed langchain from your installs, and then it works fine.
You shouldn't need langchain at all; LlamaIndex has packages for everything.
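A hedged sketch of how to confirm whether langchain is installed alongside llama-index in an environment, using only the standard library (the strings below are the PyPI distribution names, not import names):

```python
from importlib import metadata

def dist_installed(dist_name: str) -> bool:
    """Return True if a distribution with this name is installed."""
    try:
        metadata.version(dist_name)
        return True
    except metadata.PackageNotFoundError:
        return False

# Check for the potentially conflicting install described above.
for dist in ("langchain", "llama-index"):
    print(dist, "installed" if dist_installed(dist) else "not installed")
```

If langchain shows up, removing it (e.g. `pip uninstall langchain`) and relying on the llama-index packages is what resolved the error in this thread.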
Okay, let me try.