Getting started with LlamaIndex for making LLM calls and deploying
adenml
4 weeks ago
I am very new to LlamaIndex. Where should I start for:
- Making a chain of LLM calls with loops
- Deploying and seeing how it behaves with multiple requests at the same time
Could you please point me to the core concepts/docs that would help me?
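As a starting point, here is a minimal sketch of both ideas using LlamaIndex's OpenAI wrapper: a loop that chains one completion into the next, and an async helper that serves several requests concurrently. It assumes the llama-index-llms-openai package is installed and OPENAI_API_KEY is set; the model name, refinement prompt, and three-round loop are illustrative choices, not something prescribed in this thread.

import asyncio

from llama_index.llms.openai import OpenAI

llm = OpenAI(model="gpt-4o-mini")

def refine_in_loop(question: str, rounds: int = 3) -> str:
    """Chain LLM calls: each iteration feeds the previous answer back in."""
    answer = llm.complete(question).text
    for _ in range(rounds):
        answer = llm.complete(
            f"Improve this answer to '{question}':\n{answer}"
        ).text
    return answer

async def handle_many(questions: list[str]) -> list[str]:
    """Handle several requests concurrently via the async variant of complete()."""
    responses = await asyncio.gather(*(llm.acomplete(q) for q in questions))
    return [r.text for r in responses]

if __name__ == "__main__":
    print(refine_in_loop("What is LlamaIndex?"))
    print(asyncio.run(handle_many(["Define RAG.", "Define an agent."])))

For the deployment side, the same acomplete/achat async methods are what you would call from an async web framework so that concurrent requests don't block each other.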
2 comments
adenml
4 weeks ago
Can I do it with structured outputs from OpenAI as well? Does LlamaIndex provide wrappers for that too?
WhiteFang_Jr
4 weeks ago
Yep, you can totally do that! You just need to pass strict=True.
https://docs.llamaindex.ai/en/stable/examples/llm/openai/#function-calling
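A minimal sketch of that, following the strict=True hint and the linked docs: structured_predict fills a prompt, calls the model, and parses the result back into a pydantic class. The Invoice schema and the example text are hypothetical, made up here for illustration.

from pydantic import BaseModel

from llama_index.core.prompts import PromptTemplate
from llama_index.llms.openai import OpenAI

class Invoice(BaseModel):
    # Hypothetical output schema for the structured response
    vendor: str
    total: float

# strict=True asks OpenAI to enforce the schema on the structured output
llm = OpenAI(model="gpt-4o-mini", strict=True)

invoice = llm.structured_predict(
    Invoice,
    PromptTemplate("Extract the invoice fields from: {text}"),
    text="Acme Corp billed us $120.50 for hosting.",
)
print(invoice.vendor, invoice.total)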