mewmewtwo
Hello guys. I have one question. When we define a custom LLM like here:
https://gpt-index.readthedocs.io/en/latest/how_to/customization/custom_llms.html#example-using-a-custom-llm-model
why do we use the text-generation task instead of, say, question answering? Is there a reason behind this? And could we use question answering as well?
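For reference, this is roughly what I mean by the two tasks. A minimal sketch using Hugging Face pipelines (not the LlamaIndex code itself; the model names are just example placeholders):

```python
from transformers import pipeline

# "text-generation": takes an arbitrary prompt string and continues it.
# This is the shape LlamaIndex needs from an LLM: it builds a prompt
# (query + retrieved context + instructions) and asks for a free-form completion.
generator = pipeline("text-generation", model="gpt2")  # example model
print(generator("Paris is the capital of", max_new_tokens=20)[0]["generated_text"])

# "question-answering": an extractive task with a fixed (question, context)
# signature that returns a span copied out of the given context, not free text.
qa = pipeline("question-answering", model="distilbert-base-cased-distilled-squad")  # example model
print(qa(question="What is the capital of France?",
         context="France's capital city is Paris."))
```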
12 comments