Hey, quick question on persisting and loading locally stored knowledge graphs: should I save with `storage_context.to_dict()` or with `persist()`? And how do I load the knowledge graph back from those files? Any recommendations?
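For context, the `persist()` route I'm asking about would look roughly like this. This is a sketch only: the `./storage` directory is a placeholder, and the import path (`llama_index` vs `llama_index.core`) depends on the installed version:

```python
def persist_kg_index(index, persist_dir="./storage"):
    # persist() writes the docstore, index store, vector store and
    # graph store as JSON files under persist_dir
    index.storage_context.persist(persist_dir=persist_dir)

def load_kg_index(persist_dir="./storage"):
    # imports kept local so the sketch runs without llama_index
    # installed; adjust the import path for your version
    from llama_index import StorageContext, load_index_from_storage

    # rebuild the storage context from the persisted JSON files,
    # then reload the index from it
    storage_context = StorageContext.from_defaults(persist_dir=persist_dir)
    return load_index_from_storage(storage_context)
```

i.e. saving once with `index.storage_context.persist(...)` and reloading with `load_index_from_storage(...)` later.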
Have you seen any studies on whether (and for which models) it matters if the retrieved data is fed as a user, system, or assistant message to the OpenAI chat API?
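To make the question concrete, here are the three placements I mean, sketched as OpenAI-style chat payloads. The snippet is hypothetical and only builds the message lists; it doesn't call the API, and the example context/question strings are made up:

```python
retrieved = "Paris is the capital of France."
question = "What is the capital of France?"

# Option 1: retrieved context inside the system message
as_system = [
    {"role": "system", "content": f"Answer using this context:\n{retrieved}"},
    {"role": "user", "content": question},
]

# Option 2: retrieved context prepended to the user message
as_user = [
    {"role": "user", "content": f"Context:\n{retrieved}\n\nQuestion: {question}"},
]

# Option 3: retrieved context injected as a prior assistant turn
as_assistant = [
    {"role": "user", "content": question},
    {"role": "assistant", "content": f"Relevant context I found:\n{retrieved}"},
    {"role": "user", "content": "Now answer using that context."},
]
```

Same retrieved text in all three, just a different `role` carrying it; that's the variable I'm asking about.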
I've got an issue where I can build a Qdrant index but can't use the reader; it throws a "name or service not known" error that seems to come from the Qdrant client. Is anyone else working with it?
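"Name or service not known" is the Linux `getaddrinfo` error for a hostname that won't resolve, so the client is probably never reaching Qdrant at all. A quick (hypothetical) way to check the configured host outside the Qdrant client:

```python
import socket

def qdrant_host_resolves(host, port=6333):
    """True if DNS can resolve host:port; the Qdrant client's
    'name or service not known' error is socket.gaierror
    surfacing from this same resolution step."""
    try:
        socket.getaddrinfo(host, port)
        return True
    except socket.gaierror:
        return False
```

If this returns False for the host the reader is configured with, the fix is in the host/URL being passed to the client, not in the reader itself.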