together.ai
A together.ai interface is currently not available in LlamaIndex's ServiceContext. To use together.ai, you can implement it as a custom LLM instead, following the example in the docs:
https://docs.llamaindex.ai/en/stable/module_guides/models/llms/usage_custom.html#example-using-a-custom-llm-model-advanced
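For reference, here is a minimal sketch of such a custom LLM wrapping together.ai, in the shape of the docs example above. It assumes together.ai's OpenAI-compatible completions endpoint at api.together.xyz, an API key in the TOGETHER_API_KEY environment variable, and an illustrative model name; adapt all three to your setup.

import os
from typing import Any

import requests
from llama_index import ServiceContext
from llama_index.llms import (
    CustomLLM,
    CompletionResponse,
    CompletionResponseGen,
    LLMMetadata,
)
from llama_index.llms.base import llm_completion_callback


class TogetherLLM(CustomLLM):
    # Illustrative values; set them to match the model you pick.
    context_window: int = 4096
    num_output: int = 256
    model_name: str = "mistralai/Mistral-7B-Instruct-v0.2"

    @property
    def metadata(self) -> LLMMetadata:
        return LLMMetadata(
            context_window=self.context_window,
            num_output=self.num_output,
            model_name=self.model_name,
        )

    @llm_completion_callback()
    def complete(self, prompt: str, **kwargs: Any) -> CompletionResponse:
        # together.ai exposes an OpenAI-compatible completions endpoint.
        resp = requests.post(
            "https://api.together.xyz/v1/completions",
            headers={"Authorization": f"Bearer {os.environ['TOGETHER_API_KEY']}"},
            json={
                "model": self.model_name,
                "prompt": prompt,
                "max_tokens": self.num_output,
            },
        )
        resp.raise_for_status()
        return CompletionResponse(text=resp.json()["choices"][0]["text"])

    @llm_completion_callback()
    def stream_complete(self, prompt: str, **kwargs: Any) -> CompletionResponseGen:
        # Streaming is omitted in this sketch; yield the full response once.
        yield self.complete(prompt, **kwargs)


# Local embeddings so nothing else in the pipeline needs an OpenAI key.
service_context = ServiceContext.from_defaults(llm=TogetherLLM(), embed_model="local")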
ollama
With Ollama, we just need to pull the model we want to use; we don't need the command ollama run mistral, correct?

ollama_pack = OllamaQueryEnginePack(model="dolphin-phi", documents=documents)

Or, with a different model:

model="perplexity-mistral-7b-instruct"