Hi, need feedback on the following
I'm looking into using an open-source LLM in place of OpenAI for generating responses. The GPU constraint for the LLM is 24GB of VRAM.
Tried the following so far:
- Camel 5B
- Stable LM 3B
- dolly-v2-3B
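For context, here's roughly how I'm loading and prompting these, in case the setup matters (just a sketch using Hugging Face transformers; the model id, prompt, and generation settings are placeholders):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder model id -- any of the models listed above, e.g. dolly-v2-3b
model_name = "databricks/dolly-v2-3b"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,   # fp16 keeps a 3B/5B model well under 24GB
    device_map="auto",           # requires `accelerate` to be installed
)

prompt = "Explain what a vector database is in two sentences."  # placeholder prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

All three of the above fit comfortably in fp16 on 24GB, so there's headroom for a larger model if one is worth it.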
What do you guys suggest?
Feedback highly appreciated! Thanks!