
Prompt engineering

At a glance

The post describes a prompt that asks the model to randomly select either "dump" or "match", with a 30% probability for "dump" and a 70% probability for "match". The comments question the use case, asking why Python's random module couldn't be used instead, and suggest trying different prompt wordings and temperature settings, then running trials to see whether the desired distribution is achieved. There is no explicitly marked answer in the thread.

prompt:
Plain Text
Please respond by randomly selecting either only "dump" or "match" for each interaction, with a 30% probability for "dump" and a 70% probability for "match".


I set a prompt like the one above.
6 comments
Hmm. This doesn’t seem like a typical use case for LLMs. Can I ask why you want to do something like this? Why not just use Python’s random module?
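For reference, the 30/70 split the comment is referring to would look something like this with Python’s random module; a minimal sketch (the labels and weights are taken from the prompt in the post):
Plain Text
import random

# Pick "dump" with 30% probability and "match" with 70% probability,
# without involving an LLM at all.
def pick_label() -> str:
    return random.choices(["dump", "match"], weights=[0.3, 0.7], k=1)[0]

print(pick_label())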
Well, I want to know whether we can get the randomness from a prompt.
I’m going to build it like this: the response generated is either "match" or "dump", and if it’s "match" we continue to the next question.
I watched the video, but I’m afraid I still don’t understand the use case, nor why using the random module wouldn’t accomplish your initial ask. Can you please clarify?
Yeah, I understand what you mean. But my client wants the randomness to come from an LLM prompt.
I see. I still think you should have them clarify why they want that. If I were going to try something like this with LLMs, I would try a few different prompts, like “generate match most of the time and dump sometimes”, then play with the temperature and run some trials to see which setting gives roughly the distribution you’re looking for. In any case, I would only do something like this to see if it’s possible, not rely on it as a feature.
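A minimal sketch of the trial-based approach suggested above. It assumes a hypothetical ask_llm(prompt, temperature) helper that wraps whatever model client is actually in use and returns the model’s text; that function name, and the trial count, are assumptions, not something from the thread:
Plain Text
from collections import Counter

PROMPT = ('Please respond by randomly selecting either only "dump" or "match" '
          'for each interaction, with a 30% probability for "dump" and a 70% '
          'probability for "match".')

def measure_distribution(ask_llm, temperature: float, trials: int = 100) -> Counter:
    """Call the model `trials` times and tally how often it answers "dump" or "match".

    `ask_llm(prompt, temperature)` is a placeholder for the real client call.
    """
    counts = Counter()
    for _ in range(trials):
        reply = ask_llm(PROMPT, temperature).strip().lower()
        counts[reply if reply in ("dump", "match") else "other"] += 1
    return counts

# Example: compare a couple of temperature settings.
# for temp in (0.7, 1.0):
#     print(temp, measure_distribution(ask_llm, temp))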