Hi guys, I have a question: I'm using llama_index in a Python script for a GPT bot with custom data, but I can't really see where it accesses the GPT API or which GPT model is used.
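From what I understand, LlamaIndex reads the `OPENAI_API_KEY` environment variable and falls back to an OpenAI default model unless you configure one yourself. A minimal sketch of making the model explicit (assuming a recent `llama-index` release with the `Settings` singleton and the `llama-index-llms-openai` package; older versions used `ServiceContext` instead, and `./data` plus the chosen model name are just placeholders):

```python
from llama_index.core import Settings, VectorStoreIndex, SimpleDirectoryReader
from llama_index.llms.openai import OpenAI

# Explicitly choose the OpenAI model LlamaIndex should call,
# instead of relying on the library's implicit default.
# The API key is picked up from the OPENAI_API_KEY environment variable.
Settings.llm = OpenAI(model="gpt-4o-mini", temperature=0.1)

# Load the custom data source and build the index as usual.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

query_engine = index.as_query_engine()
print(query_engine.query("What does my custom data say about X?"))
```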
I have an instance built with LlamaIndex and a custom data source, but it doesn't seem to remember the questions I asked before, so it isn't a connected conversation. Does anyone know a way to do that?
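One thing that might help: a plain query engine is stateless, while a chat engine keeps the conversation history. A rough sketch, assuming a recent `llama-index` version where `ChatMemoryBuffer` and `as_chat_engine` are available (the `token_limit` and chat mode are illustrative choices, not the only options):

```python
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.core.memory import ChatMemoryBuffer

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# The memory buffer stores previous turns so follow-up questions stay connected.
memory = ChatMemoryBuffer.from_defaults(token_limit=3000)
chat_engine = index.as_chat_engine(chat_mode="condense_plus_context", memory=memory)

print(chat_engine.chat("What does my data say about topic X?"))
# The follow-up can refer back to the previous answer:
print(chat_engine.chat("Can you summarize that in one sentence?"))
```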