Hi team, in LangChain we have the following Runnable methods. Do we have anything similar to these Runnable functions in LlamaIndex?

```python
@abstractmethod
def invoke(self, input: Input, config: Optional[RunnableConfig] = None) -> Output:
    """Transform a single input into an output. Override to implement.

    Args:
        input: The input to the Runnable.
        config: A config to use when invoking the Runnable. The config supports
            standard keys like 'tags' and 'metadata' for tracing purposes,
            'max_concurrency' for controlling how much work to do in parallel,
            and other keys. Please refer to the RunnableConfig for more details.
    """
```
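For reference, here is a minimal sketch of how `invoke` is typically called on a Runnable, assuming `langchain_core`'s `RunnableLambda` wrapper; the `double` function and the tag/metadata values are just illustrative:

```python
from langchain_core.runnables import RunnableLambda, RunnableConfig

# Wrap a plain function as a Runnable; invoke() maps one input to one output.
double = RunnableLambda(lambda x: x * 2)

# RunnableConfig is optional; 'tags' and 'metadata' feed into tracing/callbacks.
config = RunnableConfig(tags=["demo"], metadata={"caller": "example"})
print(double.invoke(3, config=config))  # -> 6
```

On the LlamaIndex side, one possible analog is the `Workflow` abstraction from `llama_index.core.workflow`, where each `@step` maps incoming events to outgoing events. This is a rough sketch, not a one-to-one equivalent of `Runnable.invoke`; the class name, step name, and `message` field are made up for illustration:

```python
from llama_index.core.workflow import StartEvent, StopEvent, Workflow, step

class EchoWorkflow(Workflow):
    # A single step: consume the StartEvent, produce a StopEvent
    # (roughly: transform one input into one output).
    @step
    async def echo(self, ev: StartEvent) -> StopEvent:
        return StopEvent(result=f"echo: {ev.message}")

# Usage (inside an async context):
#   result = await EchoWorkflow(timeout=10).run(message="hi")
```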
@logan Thanks for asking. I found the root cause: this is meant to execute two processes in parallel, but I didn't give the right prompt, which is why I ended up seeing this error. Updating the prompt resolved the error that occurred while making the call to the LLM.