
Updated 2 months ago

```ts
export const llamaPipeline = async (content) => {
  let OpenEmbedllm = new OpenAIEmbedding({
    temperature: 0,
    apiKey: process.env.OPENAI_SECRET_KEY,
    model: 'text-embedding-3-small',
    apiVersion: "2023-07-01-preview",
  });
  try {
    // `client` is a Postgres client defined elsewhere. Note that the query
    // result is never used below: chunkOverlap/chunkSize are hard-coded.
    const resFromAnalyze = await client.query(
      `SELECT overlap, chunk_size FROM public.${process.env.PG_DB_PREFIX}_prompts;`
    );
    let pipeline = new IngestionPipeline({
      transformations: [
        // new RemoveSpecialCharacters(),
        new SimpleNodeParser({ chunkOverlap: 200, chunkSize: 20 }),
        new KeywordExtractor({ llm: azureOpenAillm }),
        OpenEmbedllm,
      ],
    });
    const nodes = await pipeline.run({ documents: [new Document({ text: content })] });
    console.log(nodes);
    return nodes;
  } catch (err) {
    console.log(err);
  }
};
```
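One observation on the snippet above (an editor's aside, not something confirmed in the thread): `chunkOverlap: 200` is larger than `chunkSize: 20`. A minimal, hypothetical sliding-window chunker (not the LlamaIndexTS implementation) shows why the overlap must stay below the chunk size:

```typescript
// Hypothetical standalone chunker. Each step advances the window by
// (chunkSize - chunkOverlap) characters, so an overlap >= size would
// mean the window never moves forward.
function chunkText(text: string, chunkSize: number, chunkOverlap: number): string[] {
  if (chunkOverlap >= chunkSize) {
    throw new Error("chunkOverlap must be smaller than chunkSize");
  }
  const step = chunkSize - chunkOverlap;
  const chunks: string[] = [];
  for (let start = 0; start < text.length; start += step) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break; // last window reached the end
  }
  return chunks;
}
```

For example, `chunkText("abcdefghij", 4, 2)` advances by 4 − 2 = 2 characters per step and yields `["abcd", "cdef", "efgh", "ghij"]`; with overlap ≥ size the loop could not advance, which is why node parsers typically reject that configuration.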
4 comments
I think this is a bug. Should be fixed now (I think a new version got released this morning?)
hey, it still occurs 😦
it only embeds the first few nodes
can you report this on the LlamaIndexTS repo? I'm not heavily involved with that project