Find answers from the community

shakedbuk
Offline, last seen 2 months ago
Joined September 25, 2024
What is the custom system prompt argument in the chat engine?
7 comments
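A minimal sketch of what a custom system prompt argument effectively does: it is prepended as a "system" message ahead of the chat history before the messages reach the model. The names here (`ChatMessage`, `withSystemPrompt`) are illustrative, not the library's exact API.

```typescript
// Illustrative: a system prompt is injected as the first message the model sees.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

function withSystemPrompt(history: ChatMessage[], systemPrompt: string): ChatMessage[] {
  // Prepend the system prompt so the model reads it before any chat turns.
  return [{ role: "system", content: systemPrompt }, ...history];
}

const messages = withSystemPrompt(
  [{ role: "user", content: "What plans do you offer?" }],
  "You are a helpful sales assistant. Answer only from the provided sources.",
);
```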
Is there a way to build a custom response synthesizer? I want it to speak like a salesman.
5 comments
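The heart of a custom response synthesizer is how it stitches the retrieved chunks and the user query into one LLM prompt. A hedged sketch of that step with a persona instruction baked in; `buildSalesmanPrompt` is a hypothetical helper, not a library function.

```typescript
// Illustrative: combine retrieved context + query + persona into one prompt,
// which is what a response synthesizer ultimately hands to the LLM.
function buildSalesmanPrompt(chunks: string[], query: string): string {
  const context = chunks.map((c, i) => `[${i + 1}] ${c}`).join("\n");
  return [
    "You are an enthusiastic salesperson.",
    "Answer ONLY from the context below; do not invent facts.",
    "Context:",
    context,
    `Question: ${query}`,
  ].join("\n");
}
```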
export const llamaPipeline = async (content) => {
  const OpenEmbedllm = new OpenAIEmbedding({
    temperature: 0,
    apiKey: process.env.OPENAI_SECRET_KEY,
    model: "text-embedding-3-small",
    apiVersion: "2023-07-01-preview",
  });
  try {
    // Restored the template-literal backticks around the SQL string.
    const resFromAnalyze = await client.query(
      `SELECT overlap, chunk_size FROM public.${process.env.PG_DB_PREFIX}_prompts;`
    );
    const pipeline = new IngestionPipeline({
      transformations: [
        // new RemoveSpecialCharacters(),
        // chunkOverlap must be smaller than chunkSize; the original values were reversed.
        new SimpleNodeParser({ chunkSize: 200, chunkOverlap: 20 }),
        new KeywordExtractor({ llm: azureOpenAillm }),
        OpenEmbedllm,
      ],
    });
    const nodes = await pipeline.run({ documents: [new Document({ text: content })] });
    console.log(nodes);
    return nodes;
  } catch (err) {
    console.log(err);
  }
};
4 comments
Hey, is there a way to handle images in PDF files?
2 comments
@kapa.ai How do I make extractQuestionsFromNode ignore the node's metadata?
2 comments
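The underlying idea is that a node's content can be rendered for the LLM either with or without its metadata, and the extractor should get the text-only rendering. A hedged sketch of that distinction; `DocNode` and `contentForExtraction` are illustrative names, not the library API.

```typescript
// Illustrative: render node content with or without metadata. Passing the
// text-only rendering to an extractor keeps metadata out of its prompt.
type DocNode = { text: string; metadata: Record<string, string> };

function contentForExtraction(node: DocNode, includeMetadata: boolean): string {
  if (!includeMetadata) return node.text; // metadata never reaches the LLM
  const meta = Object.entries(node.metadata)
    .map(([k, v]) => `${k}: ${v}`)
    .join("\n");
  return `${meta}\n\n${node.text}`;
}
```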
Is there a splitter I can use in my llama pipeline that knows how to deal with paragraphs and lists of items, and won't cut them in the middle?
1 comment
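One way to think about this: split on paragraph boundaries first, then pack whole paragraphs into chunks so no paragraph (or list rendered as one paragraph) is ever cut mid-way. A minimal sketch of that packing logic; `splitByParagraph` is a hypothetical helper, not a library splitter.

```typescript
// Illustrative paragraph-aware splitter: split on blank lines, then pack
// whole paragraphs into chunks of at most maxChars. A single paragraph
// longer than maxChars becomes its own (oversized) chunk rather than being cut.
function splitByParagraph(text: string, maxChars: number): string[] {
  const paragraphs = text.split(/\n\s*\n/).map((p) => p.trim()).filter(Boolean);
  const chunks: string[] = [];
  let current = "";
  for (const p of paragraphs) {
    if (current && current.length + p.length + 2 > maxChars) {
      chunks.push(current);
      current = p;
    } else {
      current = current ? `${current}\n\n${p}` : p;
    }
  }
  if (current) chunks.push(current);
  return chunks;
}
```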
Hey, I am trying to force a chatEngine to talk like a salesman, so I modified its systemContextPrompt and told it not to "invent" things and to use only its sources, but it ignores this. Does anyone have good wording for enforcing it? It's using gpt-3.5-turbo.
5 comments
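One wording that is often suggested for this problem (a suggestion only, no guarantee with gpt-3.5-turbo): state the refusal behavior explicitly and give the model exact words to use when the context lacks an answer, rather than only forbidding invention.

```typescript
// Example system prompt wording (an assumption/suggestion, not a verified fix):
// explicit refusal phrasing tends to ground the model better than "don't invent".
const salesSystemPrompt = [
  "You are a persuasive but honest salesperson.",
  "Use ONLY the information in the provided context.",
  'If the context does not contain the answer, reply exactly: "I don\'t have that information."',
  "Never state a fact that is not in the context.",
].join("\n");
```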
Too bad, I am using TS; it seems this method is not supported there.
3 comments
Is there a way to see the prompts being sent with the chatEngine? Calling getPrompts() only reveals the systemContextPrompt.
5 comments
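A general workaround, independent of what getPrompts() exposes: wrap the LLM's chat call and log every message batch before forwarding it. The `ChatLLM` interface shape and `withPromptLogging` helper are illustrative assumptions, not the library's API.

```typescript
// Illustrative: decorate an LLM so every outgoing prompt is captured.
type Msg = { role: string; content: string };
interface ChatLLM {
  chat(messages: Msg[]): Promise<string>;
}

function withPromptLogging(llm: ChatLLM, log: (m: Msg[]) => void): ChatLLM {
  return {
    async chat(messages: Msg[]) {
      log(messages); // capture exactly what is sent to the model
      return llm.chat(messages);
    },
  };
}
```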
In our current flow we take the top-k nodes that are returned and pass them to the LLM to rank. Is there a built-in way to pass all the nodes to the LLM and prompt it to rank them based on some context?
6 comments
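If no built-in fits, the manual version is straightforward: put every node into one numbered prompt, ask the model for an ordering, and parse the reply. A hedged sketch of the prompt builder and the reply parser; both helper names are illustrative, and the exact reply format is an assumption.

```typescript
// Illustrative LLM reranking by hand: one prompt listing all candidates,
// plus a parser for a comma-separated ordering like "3, 1, 2".
function buildRankPrompt(nodes: string[], criteria: string): string {
  const listing = nodes.map((n, i) => `(${i + 1}) ${n}`).join("\n");
  return `Rank the following passages by: ${criteria}\n${listing}\nReply with the numbers in order, comma-separated.`;
}

function parseRanking(reply: string, count: number): number[] {
  // "3, 1, 2" -> [2, 0, 1]; anything out of range is dropped.
  return (
    reply.match(/\d+/g)
      ?.map(Number)
      .filter((n) => n >= 1 && n <= count)
      .map((n) => n - 1) ?? []
  );
}
```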
Is the title included in the embedding if I use the LLM to embed it? Does it just get appended to the node text before embedding (chunk + title)?
5 comments
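The common pattern is that extracted titles live in node metadata and get concatenated with the node text before the chunk is embedded, so the title does influence the vector. A minimal sketch of that composition; the separator and the helper name are assumptions, not the library's exact behavior.

```typescript
// Illustrative: metadata title prepended to the chunk text before embedding.
function embeddableText(title: string | undefined, chunk: string): string {
  return title ? `${title}\n\n${chunk}` : chunk;
}
```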