I would say yes, but you'll have to check whether the OpenAI embedding models support creating embeddings for Japanese text, and how good the quality is when generating Japanese content with OpenAI.
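If you want a quick sanity check, something like this works with the OpenAI Python SDK (the model name here is just an example, not a recommendation):

```python
# Quick check that Japanese text embeds without issues.
# Assumes the OpenAI Python SDK is installed and OPENAI_API_KEY is set.
from openai import OpenAI

client = OpenAI()

resp = client.embeddings.create(
    model="text-embedding-3-small",  # example model, use whichever you're evaluating
    input="これは日本語のテストです。",  # "This is a Japanese test."
)
print(len(resp.data[0].embedding))  # embedding vector length
```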
You could also pick open-source LLMs and embedding models that support Japanese.
You can then combine the LLM and embedding model in LlamaIndex and start chatting with your docs 🦙
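Roughly, the wiring could look like this — a minimal sketch assuming a recent llama-index release; the OpenAI model name and the multilingual embedding model are just examples:

```python
# Minimal sketch: OpenAI LLM + a multilingual (Japanese-capable) embedding model.
# Assumes llama-index, llama-index-llms-openai, and
# llama-index-embeddings-huggingface are installed.
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, Settings
from llama_index.llms.openai import OpenAI
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

Settings.llm = OpenAI(model="gpt-4o-mini")  # example model name
Settings.embed_model = HuggingFaceEmbedding(
    model_name="intfloat/multilingual-e5-small"  # example multilingual embedding model
)

documents = SimpleDirectoryReader("./docs").load_data()  # your Japanese docs
index = VectorStoreIndex.from_documents(documents)
chat_engine = index.as_chat_engine()
print(chat_engine.chat("この文書の要点を日本語で教えてください。"))
```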
I see. My use case is to create a language tutor using OpenAI models. Basically, the "bot" will ask questions, and the user answers with the translation. The bot should be able to determine whether the translation is correct. Is this possible?
It totally depends on how well the model can judge the translation. Also, the questions the bot will ask — will you have them stored somewhere?
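For the grading step itself, a rough sketch could look like this, again with the OpenAI Python SDK; the prompt, model name, and function are just assumptions you'd adapt for your tutor:

```python
# Rough sketch of a translation-grading call (model name and prompt are examples).
from openai import OpenAI

client = OpenAI()

def grade_translation(source_sentence: str, target_language: str, user_answer: str) -> str:
    prompt = (
        f"You are a language tutor. The learner was asked to translate this sentence "
        f"into {target_language}:\n{source_sentence}\n\n"
        f"Their answer:\n{user_answer}\n\n"
        "Say whether the translation is correct, and briefly explain any mistakes."
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # example model
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

print(grade_translation("The weather is nice today.", "Japanese", "今日は天気がいいです。"))
```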