Find answers from the community

aiua
Offline, last seen 4 months ago
Joined September 25, 2024

Hi!
I'm trying to create a RAG system using Mistral.
When I use mistral-large, I get the error message below:
raise MistralAPIException.from_response(
mistralai.exceptions.MistralAPIException: Cannot stream response. Status: 400
This only occurs with mistral-large; it doesn't happen with mistral-medium.

The code is below.
Plain Text
# Imports assume llama-index >= 0.10 package layout.
from llama_index.core import Settings
from llama_index.core.postprocessor import MetadataReplacementPostProcessor
from llama_index.core.query_engine import RetrieverQueryEngine
from llama_index.llms.mistralai import MistralAI

Settings.llm = MistralAI(model='mistral-large',
                         safe_mode=True, max_tokens=150)
query_engine = RetrieverQueryEngine.from_args(
    retriever=retriever,
    node_postprocessors=[
        MetadataReplacementPostProcessor(target_metadata_key="window")
    ],
    streaming=False,
    similarity_top_k=2,
    text_qa_template=QA_PROMPT,
)
response = query_engine.query(query)
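Since the 400 only shows up with mistral-large, one way to narrow it down is to catch the exception and fall back to a model that is known to work. A minimal sketch, assuming a hypothetical `build_engine(model)` helper that wires up `Settings.llm` and the query engine above for a given model name:

```python
def query_with_fallback(build_engine, query,
                        models=("mistral-large", "mistral-medium")):
    """Try each model in order; return (model, response) from the first one
    whose query succeeds, re-raising the last error if all of them fail.

    `build_engine` is a hypothetical helper that constructs the query engine
    for the given model name, as in the snippet above."""
    last_err = None
    for model in models:
        try:
            engine = build_engine(model)
            return model, engine.query(query)
        except Exception as err:  # e.g. MistralAPIException with status 400
            last_err = err
    raise last_err
```

This keeps the failing model's error if every model fails, so the original 400 is still visible for debugging.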
2 comments

Hi.
To check whether the query message and the context (including the information retrieved from the vector store) follow the content policy, I want to know how to call the moderation API (OpenAI API) before running the query.
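One way to do this is to run the moderation check as a gate in front of the query call. A minimal sketch, assuming an `openai.OpenAI()` client passed in as `client` and the query engine from the earlier post; `moderate_then_query` is a hypothetical helper name:

```python
def moderate_then_query(client, query_engine, text):
    """Run OpenAI's moderation endpoint on `text` and only forward it to the
    query engine if it is not flagged.

    `client` is an openai.OpenAI() instance; moderations.create() returns a
    response whose .results[0].flagged is True for policy-violating input."""
    mod = client.moderations.create(input=text)
    if mod.results[0].flagged:
        raise ValueError("query blocked by content moderation")
    return query_engine.query(text)
```

This checks the user's query before retrieval; to also check the retrieved context, the same moderation call can be run on the retrieved nodes' text before synthesis.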
2 comments
Hi! Why is the OpenAI API called multiple times for one query?
The first response already returns the correct message, and the second response returns the same message.

Plain Text
Trace: query
    |_CBEventType.QUERY -> 96.195128 seconds
      |_CBEventType.RETRIEVE -> 1.351961 seconds
        |_CBEventType.EMBEDDING -> 1.326677 seconds
      |_CBEventType.SYNTHESIZE -> 94.842815 seconds
        |_CBEventType.TEMPLATING -> 4.6e-05 seconds
        |_CBEventType.LLM -> 48.121998 seconds
        |_CBEventType.TEMPLATING -> 3.5e-05 seconds
        |_CBEventType.LLM -> 46.620856 seconds
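The two LLM events in the trace are typical of LlamaIndex's refine-style response synthesis: with `similarity_top_k=2`, each retrieved chunk can trigger its own LLM call (an initial answer, then a refine pass over it), which is why the same message can come back twice. A toy sketch of that loop (not LlamaIndex's actual implementation) to show why the call count scales with the number of chunks:

```python
def refine_answer(llm_call, question, chunks):
    """Toy refine-style synthesizer: one LLM call per retrieved chunk.

    `llm_call(prompt)` stands in for the real LLM; the first chunk produces an
    initial answer, and each later chunk asks the LLM to refine it."""
    answer = None
    calls = 0
    for chunk in chunks:
        if answer is None:
            prompt = f"Context: {chunk}\nQuestion: {question}\nAnswer:"
        else:
            prompt = (f"Existing answer: {answer}\nNew context: {chunk}\n"
                      f"Refine the answer to: {question}")
        answer = llm_call(prompt)
        calls += 1
    return answer, calls
```

If the extra call isn't wanted, lowering `similarity_top_k` to 1 or using a response mode that packs the retrieved chunks into fewer prompts (e.g. "compact") reduces the number of LLM calls.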
4 comments