Hey y'all! Been a long-time fan, so we've started working on LlamaIndex support in ComfyUI. Time to bridge the gap between LLMs and other genAI!!
Here's what we're thinking: between ComfyUI nodes, keep Settings["llm"] cleared, and only assign it inside a node while llama-index does its work. Scoping the assignment this way means ComfyUI can switch between nodes that use different models, and within any ComfyUI node, Settings["llm"] can still be set so llama-index components run normally.
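A minimal sketch of that scoping pattern, assuming LlamaIndex's `Settings` singleton with attribute access (`Settings.llm`), and assuming that assigning `None` is an acceptable "cleared" state between nodes (llama-index resolves `None` to a MockLLM):

```python
from contextlib import contextmanager

from llama_index.core import Settings


@contextmanager
def bind_node_llm(llm):
    """Assign Settings.llm only while one node runs, then clear it.

    ComfyUI executes nodes one at a time, so a scoped assignment lets
    llama-index work inside the node without leaking the model choice
    into the next node.
    """
    Settings.llm = llm
    try:
        yield llm
    finally:
        # Assumption: llama-index treats None as "LLM explicitly
        # disabled" and substitutes a MockLLM, which is fine between nodes.
        Settings.llm = None
```

A node's FUNCTION would then wrap its llama-index calls in `with bind_node_llm(llm): ...` so the global setting never outlives the node.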
The "llm" values passed between nodes need to resolve to whichever model applies, ideally without forcing models to reload unnecessarily, while still allowing models to be unloaded and reloaded explicitly. This problem space is going to need further elaboration down the line.
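For the load/unload side, a rough sketch of a process-wide registry. `LOADERS` is a hypothetical hook a node pack would fill with llama-index constructors (Ollama, LlamaCPP, OpenAI, ...); nothing here is an existing ComfyUI or llama-index API:

```python
from typing import Any, Callable, Dict

# Hypothetical loader hook: maps a string key to a constructor that
# builds the corresponding llama-index LLM object on demand.
LOADERS: Dict[str, Callable[[], Any]] = {}

_LOADED: Dict[str, Any] = {}


def get_llm(key: str) -> Any:
    """Return the cached model for `key`, loading it only on first use."""
    if key not in _LOADED:
        _LOADED[key] = LOADERS[key]()
    return _LOADED[key]


def unload_llm(key: str) -> None:
    """Drop the model so its VRAM/RAM can be reclaimed; a later
    get_llm call will transparently reload it."""
    _LOADED.pop(key, None)
```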
Currently the "llm" value passed between nodes in ComfyUI is a string, which we're hoping to use as a key into a dict of loaded models; but if the same model is meant to carry different context in different places, those entries could clash.
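One hypothetical way to defuse that clash: derive the dict key from the model name plus a hash of whatever per-node context matters (system prompt, context window, sampling settings), so "same model, different context" yields distinct entries. All names here are illustrative:

```python
import hashlib
import json


def llm_cache_key(model_name: str, context: dict) -> str:
    """Compose a registry key from the model name and its node-local
    context, so differently-configured uses of one model don't collide."""
    digest = hashlib.sha256(
        json.dumps(context, sort_keys=True).encode()
    ).hexdigest()[:12]
    return f"{model_name}:{digest}"


# e.g. llm_cache_key("llama3:8b", {"system_prompt": "...", "temperature": 0.2})
# -> "llama3:8b:<12-hex-char digest>"
```

The trade-off is that two keys may end up holding the same weights twice; whether the registry should share weights across contexts is part of the elaboration flagged above.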