
Updated 9 months ago

Hi! How can I set up the OpenAI temperature?

Hi! How can I set the OpenAI temperature? Even though I've set it explicitly, in the debug output it's always 0.1. Here is the code:
Plain Text
llm_engine = OpenAI(temperature=0, model=model_name, api_key=openai_key, max_tokens=response_limit, callback_manager=callback_manager)

Are there other places to set it up? Thanks!
2 comments
It's working fine for me:

Plain Text
>>> from llama_index.llms.openai import OpenAI
>>> llm = OpenAI(temperature=0)
>>> llm.temperature
0.0
>>> llm._get_model_kwargs()
{'model': 'gpt-3.5-turbo', 'temperature': 0.0}

But perhaps some component in your system is falling back to the default LLM. What do you do with that llm variable?
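If that's what's happening, one option is to set the global default LLM so anything that doesn't receive an explicit llm resolves to your configured model. A minimal sketch, assuming a recent llama_index with the Settings singleton (model_name and openai_key as in your snippet):
Plain Text
from llama_index.core import Settings
from llama_index.llms.openai import OpenAI

# Set the global default LLM so any component that isn't passed an
# explicit llm uses this one instead of the library's built-in default.
Settings.llm = OpenAI(temperature=0, model=model_name, api_key=openai_key)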
I use it here:
Plain Text
query_engine = index.as_chat_engine(chat_mode='context', 
                     similarity_top_k=similarity_top_k, 
                     llm=llm_engine,
                     system_prompt=prepared_system_prompt)

And because I have the env var that enables OpenAI debug logging set, I can see this in the terminal:
Plain Text
Request options: {'method': 'post', 'url': '/chat/completions', 'files': None, 'json_data': {'messages': [{'role': 'system', 'content': '---------------------\nBelow is private information: '}, {'role': 'user', 'content': 'pricing'}], 'model': 'gpt-3.5-turbo', 'stream': False, 'temperature': 0.1}}
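For what it's worth, 0.1 happens to be the default temperature of llama_index's OpenAI wrapper when none is passed, which would be consistent with the chat engine resolving to a default LLM rather than llm_engine. A quick check in the REPL:
Plain Text
>>> from llama_index.llms.openai import OpenAI
>>> OpenAI().temperature  # no explicit value: the library default
0.1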