Updated 2 months ago

Hello πŸ‘‹ I am using a local llama LLM

Hello πŸ‘‹ . I am using a local llama LLM and doing RAG. Is there some way to retrieve the data in a structured format (JSON, dict, ...) without going through OpenAI?
6 comments
Pretty sure you can use any of the other ones here: https://gpt-index.readthedocs.io/en/stable/core_modules/query_modules/structured_outputs/pydantic_program.html but it's not as guaranteed/accurate as OpenAI's is
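For anyone landing here later: the non-OpenAI programs on that docs page all boil down to the same idea: prompt the model to emit JSON, then parse and validate the reply. Here's a minimal stdlib-only sketch of that idea with a stubbed local LLM standing in for llama; the `local_llm` and `song_program` names are illustrative, not LlamaIndex API:

```python
import json
import re
from dataclasses import dataclass

@dataclass
class Song:
    title: str
    length_seconds: int

def local_llm(prompt: str) -> str:
    """Stub standing in for a local llama call; real code would invoke the model here."""
    # Local models often wrap JSON in prose or code fences, so the stub simulates that.
    return 'Sure! Here is the JSON:\n```json\n{"title": "Example", "length_seconds": 215}\n```'

def extract_json(text: str) -> dict:
    """Pull the first JSON object out of the model's free-form reply."""
    match = re.search(r"\{.*\}", text, re.DOTALL)
    if match is None:
        raise ValueError("no JSON object found in model output")
    return json.loads(match.group(0))

def song_program(description: str) -> Song:
    """A 'program' in the pydantic-program sense: template -> one LLM call -> typed object."""
    prompt = (
        "Return ONLY a JSON object with keys 'title' (string) and "
        f"'length_seconds' (integer) describing this song: {description}"
    )
    data = extract_json(local_llm(prompt))
    return Song(title=data["title"], length_seconds=int(data["length_seconds"]))

song = song_program("an example track")
print(song)  # Song(title='Example', length_seconds=215)
```

The JSON extraction is the fragile part with local models, which is why the docs call it less guaranteed than OpenAI's function calling.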
@bmax ty! I will take a look at them. Looks like the Text Completion docs are under construction, but guidance is something I haven't come across yet, so I'll give it a shot...
Hey @bmax , can you please tell me how I can incorporate these "programs", either OpenAIPydanticProgram or any other, into a flow with documents/nodes returned from the index? I don't see where these "programs" fit into any flow; all the examples with them are on simple text
Yes, it does not incorporate into an index, it's just a single LLM call.
So one thing you can do is get the response from an index and pass it into Program(
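Putting those two comments together, the flow might look like this sketch: query the index as usual, then make one extra LLM call that turns the free-text answer into structured data. The `query_index` and `local_llm` functions below are stubs standing in for the real index query and local llama call, not LlamaIndex API:

```python
import json
from dataclasses import dataclass

@dataclass
class Answer:
    summary: str
    sources: list

def query_index(question: str) -> str:
    """Stub for a normal RAG query (e.g. index.as_query_engine().query): returns free text."""
    return "LlamaIndex supports local models. Sources: docs page 1, docs page 2."

def local_llm(prompt: str) -> str:
    """Stub for the single extra call to a local llama model."""
    return ('{"summary": "LlamaIndex supports local models.", '
            '"sources": ["docs page 1", "docs page 2"]}')

def structured_answer(question: str) -> Answer:
    # Step 1: normal RAG query over the index, giving a free-text response.
    response_text = query_index(question)
    # Step 2: single extra LLM call that reformats the response as JSON.
    prompt = (
        "Convert the following answer into JSON with keys 'summary' (string) "
        f"and 'sources' (list of strings). Answer: {response_text}"
    )
    data = json.loads(local_llm(prompt))
    return Answer(summary=data["summary"], sources=data["sources"])

print(structured_answer("Does LlamaIndex support local models?"))
```

So the program never touches the index itself; it only post-processes whatever text the query engine hands back.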
ok, thanks