Thread

[Attachment: image.png]
I know the error (it should be llm = OpenAI(model="gpt-3.5-turbo"))

But OOC, which pydantic version do you have? (pip show pydantic)
pretty sure it was tested with pydantic2
but I will fix it πŸ™‚
(That's why it wasn't noticed, though)
Thank you Logan! I assume if I update to pydantic2, everything should be working
I think it should yes!
Or you can edit the installed template to have the code/fix I put above
Oh also, the create-llama app I've generated doesn't seem to be working. I give it a query and it just deletes the message entirely
that sounds sus. It worked just the other day lol
is there an error in the terminal anywhere?
No, that's the thing, it's just silent
Maybe I can just run a simple curl to the backend to see the problem, or perhaps it's a frontend issue
Hopefully, this video shows more context
there aren't any calls to the backend
Even the chrome console logs are empty? And the network logs?
Let me check on that
Hmm, I think the console flags CORS as an issue, even though I remember seeing that y'all allow CORS by default
hmmm wait are you using firefox? Maybe try in chrome or edge πŸ€”
I think I figured it out actually
It's because I'm accessing it from my server, perhaps
and edge reports this
ohhhh that might be why!
when I tested, I ran everything on the same machine
Weirdly though, I couldn't run create-llama on Windows, and that's why
classic Windows. I used WSL
because the NEXT_*** probably was a Linux env var
and it doesn't translate to Windows
I forgot you can use WSL on VSCode on Windows
Got it working!
[Attachment: image.png]
I just did some reverse ssh tunneling to my desktop
Question: so this handles all types of docs and not just PDFs,
like text or HTML, or do I need loaders for separate types?
yea, whatever files are in the data folder it will read automatically -- it handles nearly every file type
It's just using SimpleDirectoryReader() under the hood
I encourage you to dive into the template a bit to see where the magic happens πŸ™‚ Good learning process!

One downside right now is that the index is loaded from disk on every query. Swapping in a hosted vector db would solve that, but it's also something we want to solve in the template soon πŸ’ͺ
Sweet, I imagine I can tweak that myself! Also, regarding the localhost issue, I assume I either need to expose the api as well or maybe do something with that?
yea, likely expose the host/port and/or change the host/port it's binding to