uv is amazing, highly recommended.

One thing I'm going to miss about using miniconda for my virtual environments is the ability to have multiple environments per project (e.g. one for training and one for inference).
Or maybe I'm trippin.
I know that poetry made that a pain.
Why? That's still doable.
Plain Text
mkdir my-project
cd my-project
uv venv venv-gradio
uv venv venv-streamlit
source ./venv-gradio/bin/activate
If I want a different version of PyTorch for each env, do I need to have two different .toml files?
Or worse yet, do I need to edit the .toml file each time I switch envs?
Ah, I didn't think about that. Not sure.
I would use pyproject-gradio.toml and pyproject-streamlit.toml, and symlink whichever one is in use to pyproject.toml upon activation of the venv. A simple shell alias.
Not sure how annoying that would be to use. ;-)
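For concreteness, the two TOML files could be identical except for their dependency pins. A minimal sketch (the versions are placeholders, not recommendations):
Plain Text
# pyproject-gradio.toml (e.g. the inference env)
[project]
name = "my-project"
version = "0.1.0"
requires-python = ">=3.10"
dependencies = ["gradio", "torch==2.2.2"]

# pyproject-streamlit.toml would have the same skeleton with, say,
# dependencies = ["streamlit", "torch==2.5.1"]

With the symlink in place, something like uv pip install -r pyproject.toml should install the active env's pinned set.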
I'll find out once I have more free time. Thank you! 😊
Here's a bash/zsh function that will do that:
Plain Text
uvenv () {
    # Activate venv-<name> in the current project and point pyproject.toml
    # at the matching pyproject-<name>.toml.
    if [[ $# -ne 1 ]]; then
        echo "No venv name provided, exiting."
        return 1
    fi

    if [[ ! -d "venv-$1" ]]; then
        echo "Provided venv 'venv-$1' does not exist, exiting."
        return 1
    fi

    if [[ ! -f "pyproject-$1.toml" ]]; then
        echo "Required file 'pyproject-$1.toml' does not exist, exiting."
        return 1
    fi

    if [[ -f "venv-$1/bin/activate" ]]; then
        echo "Activating the 'venv-$1' venv."
        source "venv-$1/bin/activate"
        # -f replaces any symlink left behind by a previously activated venv.
        ln -sf "pyproject-$1.toml" pyproject.toml
    else
        echo "venv-$1/bin/activate not found."
    fi
}

Put it in your .bash_profile or .zprofile.
Usage:
Plain Text
mkdir my-project
cd my-project
uv venv venv-gradio
uv venv venv-streamlit
uv init
cp pyproject.toml pyproject-gradio.toml
mv pyproject.toml pyproject-streamlit.toml
uvenv gradio
uvenv streamlit

Should work. :-)
Yeah, uv is awesome. Never going back to conda or regular python venv.
Yes, you can have different PyTorch builds, certainly per system, and there are probably other switches too. https://docs.astral.sh/uv/guides/integration/pytorch/
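From that guide, the key bit is pinning torch to a dedicated index in pyproject.toml. A minimal sketch of the CPU-only case (the project metadata here is filler):
Plain Text
[project]
name = "my-project"
version = "0.1.0"
dependencies = ["torch"]

# Resolve torch from the PyTorch CPU index instead of PyPI.
[[tool.uv.index]]
name = "pytorch-cpu"
url = "https://download.pytorch.org/whl/cpu"
explicit = true

[tool.uv.sources]
torch = [{ index = "pytorch-cpu" }]

The guide also shows per-platform environment markers, so one file can pick CUDA wheels on Linux and CPU wheels on macOS.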
Similar to the test that @sodeep wrote to check for outdated packages, is there an easy way to check which models are supported by a particular LLM library, or by all of them? It's getting hard to keep up. @Logan M
Plain Text
pip list --outdated | grep llama
You really only need to update if you think something is broken, though, or if there's some new feature. If it's working fine, I wouldn't bother.
I was looking more for the models defined in the __init__.py, like 4o, o3, etc.
Probably need to crawl the repo and extract them
Right, that's the one, which is what I meant by needing to crawl the repo files
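For what it's worth, a rough local shortcut instead of crawling the repo: grep the installed integration's source for quoted model names. The module path and regex below are guesses, not a documented interface:
Plain Text
# Find where the llama-index OpenAI integration is installed,
# then pull out anything that looks like a hard-coded model name.
PKG_DIR=$(python -c "import llama_index.llms.openai as m, os; print(os.path.dirname(m.__file__))")
grep -rhoE '"(gpt-|o[0-9])[^"]*"' "$PKG_DIR" | sort -u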