
amar
Joined September 25, 2024
I hope it's ok to bump an old issue -- is the only solution to this to modify the llamaindex source?
3 comments
Hey folks, on the topic of prompts, it seems like my ServiceContext.from_defaults(query_wrapper_prompt="bla") is being ignored. Could there be something obvious I'm missing?
7 comments
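For context on what the question is calling: a minimal sketch, assuming the legacy (pre-0.10) LlamaIndex API. ServiceContext.from_defaults does accept query_wrapper_prompt, but the wrapper is only applied by completion-style LLMs such as HuggingFaceLLM (chat models format their own messages), which is one reason it can appear to be ignored; a common workaround was to set it on the LLM directly. The model name below is just the placeholder from the docs example.

```python
from llama_index import ServiceContext
from llama_index.llms import HuggingFaceLLM
from llama_index.prompts import PromptTemplate

# Wrap every query string in the model's expected chat markers.
query_wrapper_prompt = PromptTemplate("<|USER|>{query_str}<|ASSISTANT|>")

llm = HuggingFaceLLM(
    model_name="StabilityAI/stablelm-tuned-alpha-3b",
    tokenizer_name="StabilityAI/stablelm-tuned-alpha-3b",
    query_wrapper_prompt=query_wrapper_prompt,  # set on the LLM, not only the ServiceContext
)
service_context = ServiceContext.from_defaults(llm=llm)
```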
In fact, is this even a problem? Does OpenAI ignore a second system prompt?
4 comments
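For anyone wondering what "a second system prompt" means concretely: a minimal sketch of sending two system messages in one OpenAI chat request (model name is illustrative). The API accepts multiple system messages without error; whether the model weights the second one as strongly is the empirical question this thread is asking.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Two system messages in a single request; the API does not reject this.
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "system", "content": "Answer in one sentence."},
        {"role": "user", "content": "What does LlamaIndex do?"},
    ],
)
print(response.choices[0].message.content)
```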
Hi folks! I saved a pretty meaty index via index.storage_context.persist(), and I'm loading it with load_index_from_storage. Is there a way to change the ServiceContext on that? I'd like to use a different LLM after indexing. Thanks!
4 comments
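This one has a documented pattern in the legacy (pre-0.10) API: load_index_from_storage forwards extra kwargs to the index constructor, so you can hand it a fresh ServiceContext built around the LLM you want at query time. A minimal sketch, assuming the default ./storage persist directory:

```python
from llama_index import (
    ServiceContext,
    StorageContext,
    load_index_from_storage,
)
from llama_index.llms import OpenAI

# New ServiceContext with the LLM to use at query time.
service_context = ServiceContext.from_defaults(llm=OpenAI(model="gpt-4"))

# Point at the directory written by index.storage_context.persist().
storage_context = StorageContext.from_defaults(persist_dir="./storage")

# Extra kwargs are passed through to the index constructor, so the
# persisted index is rebuilt with the new ServiceContext.
index = load_index_from_storage(storage_context, service_context=service_context)
```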