How to implement NeMo Guardrails over a chat engine with streaming responses

At a glance

The post asks how to implement NeMo Guardrails over a chat engine with streaming responses. The comments indicate that the community members do not have specific information on this topic in the context of LlamaIndex. One community member suggests checking the official documentation or reaching out to technical support for more guidance. Another community member shares a GitHub link related to NeMo Guardrails and LlamaIndex, but notes that the implementation will need to be explored on one's own. There is no explicitly marked answer in the provided information.

6 comments
Please stand by @payload as I search the knowledge base. I normally take about 20 seconds or so. :wave:
----------------------
I'm sorry, but the provided knowledge sources do not contain information on how to implement Nemo guardrails over a chat engine with streaming responses in the context of LlamaIndex. I recommend checking the official documentation or reaching out to the relevant technical support for more specific guidance.
----------------------
Note: you can ask me a follow-up question by @-mentioning me again :speech_balloon:
----------------------
@Logan M @WhiteFang_Jr
from nemoguardrails import LLMRails

Should just the LLMRails instance be assigned as Settings.llm?
Which guardrail implementation do you recommend?
This you'll have to explore on your own, tbh 😅 I haven't used it myself, but I'd seen this shared somewhere in the past, so I'm sharing it with you.
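
Since the thread never lands on a concrete answer, here is a minimal sketch of one possible wiring, not taken from the thread or from official documentation. LLMRails is not a LlamaIndex LLM, so rather than assigning it to Settings.llm, the sketch keeps an ordinary LlamaIndex LLM in Settings.llm, runs each user message through the guardrails first, and only then streams tokens from the chat engine. The ./data and ./guardrails_config paths, the model name, and the REFUSAL_MESSAGE sentinel (assumed to match the refusal text defined in your rails config) are all illustrative assumptions.

```python
# A minimal sketch (illustrative only): check the user message with NeMo Guardrails
# first, then stream the answer from a LlamaIndex chat engine.
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.llms.openai import OpenAI
from nemoguardrails import LLMRails, RailsConfig

# A regular LlamaIndex LLM goes into Settings.llm, not the LLMRails object.
Settings.llm = OpenAI(model="gpt-4o-mini")

# Hypothetical data directory; any documents will do for the sketch.
documents = SimpleDirectoryReader("./data").load_data()
chat_engine = VectorStoreIndex.from_documents(documents).as_chat_engine(
    chat_mode="condense_question"
)

# Hypothetical rails config directory containing config.yml and the Colang flows.
rails = LLMRails(RailsConfig.from_path("./guardrails_config"))

# Assumed to match the fixed refusal message defined in the rails config.
REFUSAL_MESSAGE = "I'm sorry, I can't respond to that."


def guarded_stream_chat(user_message: str):
    # Run the input through the rails first; this call itself is not streamed.
    check = rails.generate(messages=[{"role": "user", "content": user_message}])
    if check["content"].strip() == REFUSAL_MESSAGE:
        # The rails blocked the message: return their canned refusal and stop.
        yield check["content"]
        return
    # The message passed the rails, so stream tokens straight from the chat engine.
    streaming_response = chat_engine.stream_chat(user_message)
    for token in streaming_response.response_gen:
        yield token


for token in guarded_stream_chat("What do the indexed documents say about pricing?"):
    print(token, end="", flush=True)
```

Note that this check-then-stream pattern applies the rails only to the user input, and the guardrails call spends an extra LLM request whose answer is discarded when the input passes; the streamed output from the chat engine bypasses the rails entirely, so output-side moderation would need a different arrangement.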