Can you please help with a workaround

At a glance

The community member is seeking a workaround to stop a call or use an alternative open-source model. Another community member responds that this is an error in a core component of llama-index and that there is no easy workaround. They provide an example code snippet involving the tiktoken library and the gpt2 encoding, which may help reproduce the issue.

Can you please help with a workaround where I want to stop this call and/or use another open-source model?
1 comment
This is an error in a core component of llama-index 😅 No easy workaround here.

Basically, this is what needs to work; running it will likely reproduce the issue:

Python
import tiktoken

# get_encoding("gpt2") fetches the GPT-2 BPE vocabulary on first use
tokenizer = tiktoken.get_encoding("gpt2").encode
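
As a quick sanity check, the resulting tokenizer can be called on a sample string; this is a minimal sketch, and the test string is just an illustration. If the underlying error is present, the get_encoding call itself should raise before the print runs.

Python
import tiktoken

# On a healthy setup this prints a list of integer token ids;
# otherwise get_encoding raises and points at the real failure.
tokenizer = tiktoken.get_encoding("gpt2").encode
print(tokenizer("hello world"))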
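
If the longer-term goal is to move off the OpenAI tokenizer toward an open-source model, one possible direction is pointing llama-index at a Hugging Face tokenizer instead. This is only a sketch, assuming a llama-index version that exposes set_global_tokenizer; the model name is purely an example, not a recommendation.

Python
from llama_index.core import set_global_tokenizer
from transformers import AutoTokenizer

# Swap the default tiktoken-based tokenizer for an open-source one.
# "HuggingFaceH4/zephyr-7b-beta" is an arbitrary example checkpoint.
set_global_tokenizer(
    AutoTokenizer.from_pretrained("HuggingFaceH4/zephyr-7b-beta").encode
)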