For indexes generated via stablellm, I'm expecting very little OpenAI credit usage for embeddings, but I see davinci LLM usage in the OpenAI usage report. So I suspect some of the indexes are older versions generated with an OpenAI LLM, and I want to identify and regenerate them using stablellm.
@Logan M - would it be possible to print the name of the LLM from a loaded index? (assuming the index is loaded from storage without passing a custom service context)
I tried index._service_context.llm_predictor.get_llm_metadata()
It only seems to show these details: context_window=4097, num_output=-1)
Is there a similar helper function in the service context to get the name or other details of the model used in that index?
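For illustration, this is roughly the kind of helper I'm after. It's only a sketch: the attribute names it probes ("model", "model_name", "engine") are guesses, since they vary across LlamaIndex versions and LLM wrappers, and the commented usage assumes the older load_index_from_storage / service_context API.

```python
def describe_llm(llm) -> str:
    """Return the LLM's class name plus any model identifier it exposes.

    The attribute names below are assumptions -- different LLM wrappers
    expose the model id under different names, so we probe common ones.
    """
    for attr in ("model", "model_name", "engine"):
        value = getattr(llm, attr, None)
        if value:
            return f"{type(llm).__name__}(model={value})"
    return type(llm).__name__  # no recognizable model attribute found

# Hypothetical usage on a loaded index (older LlamaIndex API; names may
# differ by version):
#   from llama_index import StorageContext, load_index_from_storage
#   storage_context = StorageContext.from_defaults(persist_dir="./storage")
#   index = load_index_from_storage(storage_context)
#   print(describe_llm(index.service_context.llm_predictor.llm))
```

Something like this would let me loop over all persisted indexes and flag the ones whose LLM class or model id points at an OpenAI davinci model.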