mewmewtwo · 2 years ago
Custom LLM pipeline
Hello guys, I have a question. When we define a custom LLM like here:
https://gpt-index.readthedocs.io/en/latest/how_to/customization/custom_llms.html#example-using-a-custom-llm-model
why do we use the text-generation task instead of, for example, question answering? Is there a reason behind this? And could we use question answering as well?
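For context on what the question is getting at: the index drives the custom LLM through a single completion-style call, where the prompt it builds already bundles the retrieved context together with the user's question, so a text-generation interface matches that shape directly, while a question-answering model expects separate (question, context) inputs. A minimal sketch of that difference, with a toy stand-in model rather than any real LlamaIndex or HuggingFace API (all names here are illustrative assumptions):

```python
def toy_qa_model(question: str, context: str) -> str:
    """Stand-in for a QA-style model taking separate (question, context)
    inputs: returns the first context sentence sharing a word with the
    question (purely illustrative)."""
    for sentence in context.split("."):
        if any(word.lower() in sentence.lower() for word in question.split()):
            return sentence.strip()
    return ""

def qa_as_completion(prompt: str) -> str:
    """Adapter a QA model would need to serve a text-generation-style
    interface: split the combined prompt back into (question, context).
    Assumes a hypothetical 'Context: ...\\nQuestion: ...' template."""
    context, _, question = prompt.partition("Question:")
    context = context.replace("Context:", "", 1).strip()
    return toy_qa_model(question.strip(), context)

# A completion-style prompt as an index might assemble it:
prompt = (
    "Context: Paris is the capital of France. "
    "Berlin is the capital of Germany.\n"
    "Question: What is the capital of France?"
)
print(qa_as_completion(prompt))  # → Paris is the capital of France
```

The point of the sketch: a text-generation model consumes this combined prompt as-is, whereas a QA model only works after prompt-parsing glue like `qa_as_completion`, which is fragile if the template changes.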
12 comments