Is Python's tooling incredibly difficult, or am I just stupid?
from zaphodb2002@sh.itjust.works to programming@programming.dev on 06 Nov 18:44
https://sh.itjust.works/post/27698868

So I’m no expert, but I have been a hobbyist C and Rust dev for a while now, and I’ve installed tons of programs from GitHub and whatnot that required manual compilation or other hoops to jump through, but I am constantly befuddled installing python apps. They seem to always need a very specific (often outdated) version of python, require a bunch of venv nonsense, googling gives tons of outdated info that no longer works, and generally seem incredibly not portable. As someone who doesn’t work in python, it seems more obtuse than any other language’s ecosystem. Why is it like this?

#programming


Ephera@lemmy.ml on 06 Nov 18:56 next collapse

Python never had much of a central design team. People mostly just scratched their own itch, so you get lots of different tools that do only a small part each, and aren’t necessarily compatible.

kSPvhmTOlwvMd7Y7E@programming.dev on 06 Nov 19:05 next collapse

You're not stupid, Python's packaging & versioning is a PITA. As long as you write it for yourself, you're good. As soon as you want to share it, you have a problem.

MajorHavoc@programming.dev on 06 Nov 20:16 collapse

As long as you write it for yourself, you're good. As soon as you want to share it, you have a problem

A perfect summary of the history of computer code!

ad_on_is@lemm.ee on 06 Nov 19:06 next collapse

This is exactly how I feel about python as well… IMHO, it’s good for some advanced stuff, where bash starts to hit its limits, but I’d never touch it otherwise

iii@mander.xyz on 06 Nov 19:08 next collapse

I agree. Python is my language of choice 80% or so of the time.

But my god, it does packaging badly! Especially if it’s dependent on linking to compiled code!

Why it is like that, I couldn’t tell. The language is older than git, so that might be part of it.

However, you’re installing python libraries from github? I very very rarely have to do that. In what context do you have to do that regularly?

ebc@lemmy.ca on 06 Nov 19:14 next collapse

I’m no Python expert either and yeah, from an outsider’s perspective it seems needlessly confusing. easy_install that’s never been easy, pip that should absolutely be put on a Performance Improvement Plan, and now this venv nonsense.

You can criticize javascript’s ridiculous dependencies all you want (left-pad?), but one thing that they absolutely got right is how to manage them. Everything’s in node_modules and that’s it. Yeah, you might get eleven copies of left-pad on your system, but you know what you NEVER get? Version conflicts between projects you’re working on.

moreeni@lemm.ee on 06 Nov 20:07 collapse

Seriously. Those are EXACTLY the thoughts I had after I was forced to deal with Python after a ton of time writing projects in JS.

ravhall@discuss.online on 06 Nov 19:20 next collapse

This isn’t the answer you want, but Go(lang) is super easy to learn and has a ton of speed on python. Yes, it’s more difficult, but once you understand it, it’s got a lot going for it.

lime@feddit.nu on 06 Nov 21:41 collapse

it’s also not at all relevant. go is great, but this is about python.

ravhall@discuss.online on 06 Nov 21:44 collapse

I’m sorry I offended you.

lime@feddit.nu on 06 Nov 21:53 collapse

this is not about offense! nobody is offended. but if you ask me for help with an apple pie and i tell you to make meatballs… it’s a confusing lack of relevance.

ravhall@discuss.online on 06 Nov 21:58 collapse

I did lead with an appropriate request for a sidebar. I just feel the rip about context was even less appropriate. And apple cobbler would be a better comparison. Apples, just different.

lime@feddit.nu on 06 Nov 23:28 collapse

it’s not though. op has issues installing programs built in python. suggesting they rebuild those programs in go is 100% an apples to meatballs comparison, and way off topic.

ravhall@discuss.online on 07 Nov 01:57 collapse

They should get those same programs, but for Go. I’m sure someone has made whatever they’re doing. It would work better.

Orygin@sh.itjust.works on 07 Nov 08:22 next collapse

You’re not wrong, but you have offended the python guys for suggesting they use something other than their toy language.
I personally look away when I find programs I want to use that are written in python. I don’t have time to play with all that BS just to run a small software on my machine. Go is my go-to (heh) but any other modern language would be fine.

lime@feddit.nu on 07 Nov 08:42 collapse

such a strange interpretation. i’ve been working in go for over 10 years now, and i love it. but the notion that you can “just find the same program but built in a different language” doesn’t make sense at all.

like, if you’re annoyed with pandoc being written in haskell and clogging up your system dependencies, you can’t just “find another pandoc”. there’s nothing like it. same thing with curl, or xonsh, or thingsboard.

Orygin@sh.itjust.works on 07 Nov 13:53 collapse

I agree in general, if you need something specific then there is no way around it. But when I’m looking for something I evaluate all possible solutions, and being written in a language that has issues like this is a mark against it. Sometimes it’s easier to write the thing myself in some language I master than to wrangle python or Js dependencies.
In my experience there is rarely only one solution written in python or Js for my use cases.

lime@feddit.nu on 07 Nov 13:58 collapse

that’s posturing if anything. if you’re an experienced developer it takes fully 10 minutes with either system. and if you’re not interested in modifying it, just use a container image.

the only case where i would agree with you is when i have to modify LD_LIBRARY_PATH to get things to run…

Orygin@sh.itjust.works on 07 Nov 14:33 collapse

Depends on what you’re used to. I have lost too much time trying to get a python or js program to run on my machine.
Of course if the project is well written and with decent documentation it’s easier, but in general I have had too many incompatibilities with versions of the tooling and the dependencies which may be too ancient to work properly. On the other side, go code that was written a decade ago still compiles fine without thinking about it.
Hell I even had a js project that was working then 6 months later, without changing any code in it, wouldn’t build. Talking to a front end dev at work he immediately said “oh yeah node was probably updated and you need to do x and y to make it work”. Sorry but I have other things to do than massaging bad tooling to build this.

Btw, even containers are not a bullet proof solution. I had a python container straight up not work even though it was distributed like that.

lime@feddit.nu on 07 Nov 14:43 collapse

i mean, that is the difference between interpreted and compiled.

if the container doesn’t work though, that means it is broken and should be fixed. the point of them is literally to be plug-n-play. that would be like distributing a go binary with a segfault in main.

lime@feddit.nu on 07 Nov 08:39 collapse

such a weird take.

Balinares@pawb.social on 06 Nov 19:20 next collapse

It… depends. There is some great tooling for Python – this was less true only a few years ago, mind you – but the landscape is very much in flux, and usage of the modern stuff is not yet widespread. And a lot of the legacy stuff has a whole host of pitfalls.

Things are broadly progressing in the right direction, and I’d say I’m cautiously optimistic, although if you have to deal with anything related to conda then for the time being: good luck, and sorry.

DarkThoughts@fedia.io on 06 Nov 19:26 next collapse

Tried to install Automatic1111 for Stable Diffusion in an Arch distrobox, and despite editing the .sh file to point to the older tarballed Python version as advised on Github, it still tells me it uses the most up to date one that's installed system wide and thus can't install pytorch. And that's pretty much where my personal knowledge ends, and apparently that of those (i.e. that one person) on Github. ¯\_(ツ)_/¯

Always funny when people urge you to ask for help but no one ends up actually helping.

tal@lemmy.today on 06 Nov 19:33 next collapse

despite editing the .sh file to point to the older tarballed Python version as advised on Github, it still tells me it uses the most up to date one that’s installed system wide and thus can’t install pytorch.

Can you paste your commands and output?

If you want, maybe on !imageai@sh.itjust.works, since I think that people seeing how to get Automatic1111 set up might help others.

I’ve set it up myself, and I don’t mind taking a stab at getting it working, especially if it might help get others over the hump to a local Automatic1111 installation.

zaphodb2002@sh.itjust.works on 07 Nov 05:19 collapse

Lol this is exactly why I made this post. I ended up using ComfyUI instead which has other, different python issues, but I got it working (kinda, no GPU but it’s fine it works)

DarkThoughts@fedia.io on 07 Nov 11:05 collapse

I definitely want gpu support. Although I struggle with that somewhat on Koboldcpp as well where I can't use ROCm, only Vulkan. Unsure where the difference is performance wise.

I'd like to try the other UIs too, but the problem is that Automatic1111 is where the majority of additional plugins can be found.

tal@lemmy.today on 06 Nov 19:31 next collapse

venv nonsense

I mean, it's annoying that it isn't more invisible to the end user, and I wish that it could also include a version of Python, but I think that venv is pretty reasonable. It handles non-systemwide library versioning in what I'd call a reasonably straightforward way. Once you know how to do it, it works the same way for each Python program.

Honestly, if there were just a frontend on venv that set up any missing environment and activated the venv, I’d be fine with it.
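
Something like this tiny wrapper is all I mean - a rough sketch, assuming a requirements.txt sits next to the project (untested):

# create the env on first use, then drop into it
if [ ! -d .venv ]; then
    python3 -m venv .venv
    .venv/bin/pip install -r requirements.txt
fi
source .venv/bin/activate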

And I don’t do much Python development, so this isn’t from a “Python awesome” standpoint.

scrion@lemmy.world on 07 Nov 00:44 collapse

pyenv and uv let you install and switch between multiple Python versions.

As for uv, those Python builds come from the python-build-standalone project, if I remember correctly; pyenv also installs from there, but don't quote me on that.
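
For example, roughly (version numbers are just placeholders, flags from memory):

# pyenv: build and select an interpreter for the current directory
pyenv install 3.10.14
pyenv local 3.10.14
# uv: fetch a standalone build and pin it for the project
uv python install 3.12
uv python pin 3.12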

nickwitha_k@lemmy.sdf.org on 06 Nov 19:35 next collapse

Python’s packaging is not great. Pip and venvs help but, it’s lightyears behind anything you’re used to. My go-to is using a venv for everything.

solrize@lemmy.world on 06 Nov 19:45 next collapse

It’s something of a “14 competing standards” situation, but uv seems to be the nerd favourite these days.

iii@mander.xyz on 06 Nov 20:05 next collapse

I still do the python3 -m venv venv && source venv/bin/activate

How can uv help me be a better person?

GBU_28@lemm.ee on 06 Nov 21:22 next collapse

And pip install -r requirements.txt

BeardedGingerWonder@feddit.uk on 06 Nov 23:00 collapse

Fuck it, I just use sudo and live with the consequences.

flying_sheep@lemmy.ml on 07 Nov 07:25 next collapse

Oh no

Swedneck@discuss.tchncs.de on 07 Nov 12:53 next collapse

the software equivalent of leaving the dirt on your vegetables to harden your immune system

bamboo@lemm.ee on 11 Nov 17:03 collapse

You’ll see when you start your second project why this doesn’t work.

PartiallyApplied@lemmy.world on 07 Nov 07:02 next collapse

If you’re happy with your solution, that’s great!

uv combines a bunch of tools into one simple, incredibly fast interface, and keeps a lock file up to date with what’s installed in the project right now. Makes docker and collaboration easier. Its main benefit for me is that it minimizes context switching/cognitive load

Ultimately, I encourage you to use what makes sense to you tho :)

NostraDavid@programming.dev on 07 Nov 18:19 collapse

  1. let pyproject.toml track the dependencies and dev-dependencies you actually care about
    • dependencies are what you need to run your application
    • dev-dependencies are not necessary to run your app, but to develop it (formatting, linting, utilities, etc)
  2. it can track exactly what's needed to run the application via the uv.lock file that contains each and every lib that's needed.
  3. uv will install the needed Python version for you, completely separate from what your system is running.
  4. uv sync and uv run <application> is pretty much all you need to get going (see the sketch below)
  5. it's blazingly fast in everything
iii@mander.xyz on 07 Nov 19:07 collapse

Thank you for explaining so clearly. Point 3 is indeed something I've run into before!

QuazarOmega@lemy.lol on 06 Nov 20:28 next collapse

This! Haven’t used that one personally, but seeing how good ruff is I bet it’s darn amazing, next best thing that I used has been PDM and Poetry, because Python’s first party tooling has always been lackluster, no cohesive way to define a project and actually work it until relatively recently

scrion@lemmy.world on 07 Nov 00:33 next collapse

I moved all our projects (and devs) from poetry to uv. Reasons were poetry's non-standard pyproject.toml syntax and speed, plus some weird quirks, e. g. if poetry asks for input and is not run with the verbose flag, devs often don't notice and believe it is stuck (even though it's in the default project README).

Personally, I update uv on my local machine as soon as a new release is available so I can track any breaking changes. A couple of months in, I can say there were some hiccups in the beginning, but currently it's smooth sailing, and the speed gain really helps productivity as well, mostly because you're no longer pulled out of a mental "flow" state while staring at updates, wondering whether something is wrong. Don't get me wrong, apart from the custom syntax (poetry partially predates the pyproject PEP), poetry worked great for us for years, but uv feels nicer.

Recently, "uv build" was introduced, which simplified things. I wish there was a command to update the lock file while also updating the dependency specs in the project file. I ran some command today and by accident discovered that custom dependency groups (apart from e. g. "dev") have made it to uv, too.

“uv pip” does some things differently, in particular when resolving packages (it’s possible to switch to pip’s behavior now), but I do agree with the decisions, in particular the changes to prevent “dependency confusion” attacks.

As for the original question: Python really has a bit of a history of project management and build tools, I do feel however that the community and maintainers are finally getting somewhere.

cargo is a bit of an "unfair" comparison, since its development happened much more in alignment with Rust and its whole ecosystem rather than as an afterthought by third-party developers, but I agree: cargo is definitely a great benchmark for how project and dependency management plus building should look; along with rustup, it really makes the developer experience quite pleasant.

The need for virtual environments exists so that different projects can use different versions of dependencies and those dependencies can be installed in a project specific location vs a global, system location. Since Python is interpreted, these dependencies need to stick around for the lifetime of the program so they can be imported at runtime. poetry managed those in a separate folder in e. g. the user’s cache directory, whereas uv for example stores the virtual environment in the project folder, which I strongly prefer.

cargo will download the matching dependencies (along with doing some caching) and link the correct version to the project, so a conceptual virtual environment doesn't need to exist for Rust. By default, Rust links everything apart from the C runtime statically, so the dependencies are no longer needed after the build - except that you probably want to rebuild the project later, so there is some caching.

Finally, I'd also recommend to go and try setting up a project using astral's uv. It handles sane pyproject.toml files, will create/initialize new projects from a template, manages virtual environments and has a CLI to build e. g. wheels or source distributions (you will need to specify which build backend to use; I use hatchling), but that's just a decision you make and express as one line in the project file. Note: hatchling is the build backend, hatch is pypa's project management, pretty much an alternative to poetry or uv.

uv will also install complete Python distributions (e. g. Python 3.12) if you need a different interpreter version for compatibility reasons

If you use workspaces in cargo, uv also does those.

uv init, uv add, uv lock --upgrade, uv sync, uv build and how uv handles tools you might want to install and run should really go a long way and probably provide an experience somewhat similar to cargo.
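
To make that concrete, roughly (commands from memory, names are examples):

# one-off / global tools get their own managed environments
uv tool install ruff
uvx ruff check .
# refresh the lock file to the newest allowed versions, then sync the venv
uv lock --upgrade
uv sync
# build a wheel + sdist with whatever backend pyproject.toml names (e.g. hatchling)
uv build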

QuazarOmega@lemy.lol on 07 Nov 09:27 collapse

I think you responded to the wrong comment, I didn’t question the need for uv or other tools like that

scrion@lemmy.world on 07 Nov 19:26 collapse

I did that on purpose, i. e. I wanted to confirm your thoughts about uv, drifted off into a general rant, remembered OP’s original question and later realized it would have been better framed as a top level comment. In my defense, I was in an altered state of mind at the time.

QuazarOmega@lemy.lol on 08 Nov 08:21 collapse

Fair lol, it was welcome anyway

NostraDavid@programming.dev on 07 Nov 18:24 collapse

I bet it’s darn amazing,

It is. In this older article (by Anna-Lena Popkes) uv is still not in the middle, but I would claim it’s the new King of Project Management, when it comes to Python.

uv init --name <some name> --package --app and you’re off to the races.

Are you cloning a repo that’s uv-enabled? Just uv sync and you’re done!

Heck, you can now add dependencies to a script and just uv run --script script.py (IIRC) and you don’t need to install anything - uv will take care of it all, including a needed Python version.
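
Roughly like this, if I remember the flags right (script name made up):

# record the dependency as inline metadata inside the script itself
uv add --script fetch.py requests
# uv creates a throwaway environment (and fetches a Python, if needed) to run it
uv run fetch.py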

Only downside is that it’s not 1.0 yet, so the API can change at any update. That is the last hurdle for me.

priapus@sh.itjust.works on 06 Nov 20:06 next collapse

Yeah the tooling sucks. The only tooling I’ve liked is Poetry, I never have trouble installing or packaging the apps that use it.

Ephera@lemmy.ml on 07 Nov 04:27 next collapse

Personally, I’ve found Poetry somewhat painful for developing medium-sized or larger applications (which I guess Python really isn’t made for to begin with, but yeah).

Big problem is that its dependency resolution is probably an order of magnitude slower than it should be. Anytime we changed something about the dependencies, you'd wait more than a minute for its verdict. Which is particularly painful when you have to resolve version conflicts.

The other big pain point is that it doesn't support workspaces, multi-project builds, or whatever you want to call them, where you can have multiple related applications or libraries in the same repo directly depending on each other, without needing to publish a new version of the libraries each time you make a change.

When we started our last big Python project, none of the Python tooling supported workspaces out of the box. Now there's Rye, which does. But yeah, I don't have experience yet with how well it works.

NostraDavid@programming.dev on 07 Nov 18:31 collapse

Downside: “^1.2.3” as default versioning for libraries. You just pinned a version? Oh great, now I can’t upgrade another library because you had to pin something in yours…

That non-standard syntax has been a PITA for the last few years. That being said: They created that syntax for regular applications (and not for libs) in a time when the pyproject.toml syntax was not anywhere near finalization.

it_depends_man@lemmy.world on 06 Nov 20:06 next collapse

The difficulty with python tooling is that you have to learn which tools you can and should completely ignore.

Unless you are a 100x engineer managing 500 projects with conflicting versions, build systems, docker, websites, and AAAH…

  • you don’t really need venvs
  • you should not use more than one package manager (I recommend pip) and you should cling to it with all your might and never switch. Mixing, e.g., conda and Linux system installers like apt is the problem. Just using one is fine.
  • You don’t “need” need any other tools. They are bonuses that you should use and learn how to use, exactly when you need them and not before. (type hinting checker, linting, testing, etc…)

Why is it like this?

Isolation for reliability, because it costs the businesses real $$$ when stuff goes down.

venvs exist to prevent the case where "project 1" and "project 2" use the same library "foobar". Except "project 1" is old, its maintainer is held up and can't update as fast, and "project 2" is a cutting-edge startup that always uses the newest tech.

When Python imports a library, it uses whichever version of "the library" is installed. If project 2 uses foobar version 15.9, which changed functionality, and project 1 uses foobar version 1.0, you always get a bug in either project 1 or project 2. Venvs solve this by providing project-specific sets of libraries and interpreters.
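
Concretely, that looks something like this (foobar is the made-up library from above):

# project 1 keeps the old version in its own environment
cd project1
python3 -m venv .venv
.venv/bin/pip install 'foobar==1.0'
# project 2 gets the new one, fully isolated from project 1
cd ../project2
python3 -m venv .venv
.venv/bin/pip install 'foobar==15.9'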

In practice for many if not most users, this is meaningless, because if you’re making e.g. a plot with matplotlib, that won’t change. But people have “best practices” so they just do stuff even if they don’t need it.

It is a tradeoff between being fine with breakage and fixing it when it occurs and not being fine with breakage. The two approaches won’t mix.

very specific (often outdated) version of python,

They are giving you the version that they know worked. Often you can just remove the specific version pinning and it will work fine, because again, it doesn’t actually change that much. But still, the project that’s online was the working state.

ebc@lemmy.ca on 06 Nov 20:18 collapse

Coming at this from the JS world… Why the heck would 2 projects share the same library? Seems like a pretty stupid idea that opens you up to a ton of issues, and for what, to save 200kb on your hard drive?

jacksilver@lemmy.world on 06 Nov 20:31 next collapse

Yeah, not sure I would listen to this guy. Setting up a venv for each project is about the bare minimum on all the teams I've worked on.

That being said, Python envs can be GBs in size (especially when doing data science).

NostraDavid@programming.dev on 07 Nov 18:40 collapse

especially when doing data science

500MB for Ray, another 500MB for Polars (though that was a bug IIRC), a few more megs for whatever binaries to read out those weird weather files (NetCDF and Grib2).

it_depends_man@lemmy.world on 06 Nov 20:37 collapse

Why the heck would 2 projects share the same library?

Coming from the olden days, with good package management, infrequent updates and the idea that you wanted to indeed save that x number of bytes on the disk and in memory, only installing one was the way to go.

Python also wasn’t exactly a high brow academic effort to brain storm the next big thing, it was built to be a simple tool and that included just fetching some library from your system was good enough. It only ended up being popular because it is very easy to get your feet wet and do something quick.

atzanteol@sh.itjust.works on 06 Nov 20:15 next collapse

With all the hype surrounding Python it’s easy to forget that it’s a really old language. And, in my opinion, the leadership is a bit of a mess so there hasn’t been any concerted effort on standardizing tooling.

Some unsolicited advice from somebody who is used to more refined build environments but is doing a lot of Python these days:

The whole venv thing isn't too bad once you get the hang of it. But be prepared for people to tell you that you're using the wrong venv for reasons you'll never quite understand or likely need to care about. Just use the bundled "python -m venv venv" and you'll be fine despite other "better" alternatives. It's bundled so it's always available to you. And feel free to just drop/recreate your venv whenever you like or need. They're ephemeral and pretty large once you've installed a lot of things.

Use “pipx” to install python applications you want to use as programs rather than libraries. It creates and manages venvs for them so you don’t get library conflicts. Something like “pip-tools” for example (pipx install pip-tools).

Use “pyenv” to manage installed python versions - it’s a bit like “sdkman” for the JVM ecosystem and makes it easy to deal with the “specific versions of python” stuff.

For dependencies for an app - I just create a requirements.txt and “pip install -r requirements.txt” for the most part… Though I should use one of the 80 better ways to do it because they can help with updating versions automatically. Those tools mostly also just spit out a requirements.txt in the end so it’s pretty easy to migrate to them. pip-tools is what my team is moving towards and it seems a reasonable option. YMMV.
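
Put together, the day-to-day looks roughly like this (versions and names are just examples):

# pick an interpreter for this project
pyenv install 3.12.3
pyenv local 3.12.3
# project-local environment plus dependencies
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt
# CLI tools live in their own pipx-managed venvs
pipx install pip-tools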

taaz@biglemmowski.win on 07 Nov 18:23 collapse

This.

venv
pip-tools

Specify your primary dependencies in pyproject.toml and use pip-compile to keep stuff locked in requirements.txt to exact versions (or even hashes).
Though after working with cargo a bit, I would love to have all of this in a first-class program, hope uv can get there.
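
For reference, that flow is roughly (assuming pip-tools is installed and pyproject.toml lists your primary deps):

# resolve the primary dependencies from pyproject.toml into an exact lock
pip-compile pyproject.toml -o requirements.txt
# (add --generate-hashes for hash-pinned installs)
# make the venv match the lock file exactly
pip-sync requirements.txt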

lime@feddit.nu on 06 Nov 21:50 next collapse

everyone focuses on the tooling, but not many focus on the reason: python is extremely dynamic. like, magic dynamic: you can modify a module halfway through an import, you can replace class attributes and automatically propagate to instances, you can decompile the bytecode while it's running.

combine this with the fact that it’s installed by default and used basically everywhere and you get an environment that needs to be carefully managed for the sake of the system.

js has this packaging system down pat, but it has the advantage that it got mainstream in a sandboxed isolated environment before it started leaking out into the system. python was in there from the beginning, and every change breaks someone’s workflow.

the closest language to look at for packaging is probably lua, which has similar issues. however since lua is usually not a standalone application platform it’s not a big deal there.

tyler@programming.dev on 08 Nov 20:45 collapse

and yet that all works fine in Ruby, which came out around the same time as Python and yet has had Bundler for 15 years now.

Python - 15+ package managers and build tools
Ruby - 1

the closest language to look at for packaging is probably lua, which has similar issues. however since lua is usually not a standalone application platform it’s not a big deal there.

no the closest language is literally Ruby, it’s almost the exact same language, except the tooling isn’t insane and it came out only a few years after python.

lime@feddit.nu on 09 Nov 01:20 collapse

good point, ruby is a good comparison. although, ruby is very different under the hood. it’s magically dynamic in a completely different way, and it also never really got the penetration on the system level that python did.

none of this is to take away from the fact that python packaging is bad. i know how to work it because i’ve been programming in python for 14 years, but trying to teach people makes the problem obvious. and yet.

onlinepersona@programming.dev on 06 Nov 21:59 next collapse

Difficult? How so? I find compiling C and C++ stuff much more difficult than anything python. It never works on the first try whereas with python the chances are much much higher.

What’s is so difficult to understand about virtual envs? You have global python packages, you can also have per user python packages, and you can create virtual environments to install packages into. Why do people struggle to understand this?

The global packages are found thanks to default locations, which can be overridden with environment variables. Virtual environments set those environment variables to be able to point to different locations.

python -m venv .venv/ means python will execute the module venv and tell it to create a virtual environment in the .venv folder in the current directory. As mentioned above, the environment variables have to be set to actually use it. That’s when source .venv/bin/activate comes into play (there are other scripts for zsh and fish). Now you can run pip install $package and then run the package’s command if it has one.
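
Spelled out, with the package name as a placeholder:

# create the environment, point the shell at it, install into it
python -m venv .venv
source .venv/bin/activate
pip install $package
# when you're done, drop back to the normal environment
deactivate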

It’s that simple. If you want to, you can make it difficult by doing sudo pip install $package and fucking up your global packages by possibly updating a dependency of another package - just like the equivalent of updating glibc from 1.2 to 1.3 and breaking every application depending on 1.2 because glibc doesn’t fucking follow goddamn semver.

As for old versions of python, bro give me a break. There's pyenv for that if whatever old ass package you're installing depends on an ancient 10 year old python version. You really think building a C++ package from 10 years ago will work more smoothly than python? Have fun tracking down all the unlocked dependency versions that "Worked On My Machine™" at the start of the century.

The only python packages I have trouble installing are those with C/C++ dependencies which have to be compiled at install time.

Y’all have got to be meme’ing.

Anti Commercial-AI license

tyler@programming.dev on 08 Nov 20:56 collapse

I think you have got to be meme’ing. You literally wrote 7 paragraphs about how to build something for python when for other languages it’s literally a single command. For Ruby, it’s literally bundle. Nothing else. Doesn’t matter if it’s got C packages or not. Doesn’t matter if it’s windows or not. Doesn’t matter if you have a different project one folder over that uses an older gem or not. Doesn’t matter if it’s 15 years old or not. One command.

Just for comparison: for Gradle it's ./gradlew build, for Maven it's mvn install, for Elixir it's mix deps.get && mix compile, for Node it's npm install.

For every other language it's hardly more than one command.

Python is the only language that thinks that it’s even slightly acceptable to have virtual environments when it was universally decided upon decades ago to be a tremendously bad idea. Just like node_modules which also was known to be a bad idea before npm decided to try it out again, only for it to be proven to be a bad idea right off the bat. And all the other python build tools have agreed that virtual envs are bad.

Rogue@feddit.uk on 07 Nov 00:51 next collapse

Docker might be a solution here.

But from my experience most python scripts are absolute junk. The barrier to entry is low, so there's a massive disparity in quality.

magic_lobster_party@fedia.io on 07 Nov 16:53 collapse

Python is truly a mess when Docker is considered a solution.

antlion@lemmy.dbzer0.com on 07 Nov 02:31 next collapse

Python is hacky, because it hacks. There's a bunch of ways you can do anything. You can run it on numerous platforms, or even on web assembly. It's not maintained centrally. Each "app" you find is just somebody's hack project they're sharing with you for fun.

bhamlin@lemmy.world on 07 Nov 02:58 collapse

Python is the new Perl

AnUnusualRelic@lemmy.world on 07 Nov 11:15 next collapse

After using python, I’m of the opinion that perl was much cleaner.

bhamlin@lemmy.world on 07 Nov 12:19 next collapse

Yes. Its line noise was of a much higher quality. 😉

magic_lobster_party@fedia.io on 07 Nov 16:52 collapse

Nothing comes close to Perl’s abuse of global variables. Oh you called this function? Take a guess which global variables it will use.

Zykino@programming.dev on 07 Nov 12:54 collapse

On that note, I’m hesitant between writing my scripts in perl or python right now. Bash prevent sharing with Windows peoples… I just want to provide easy wrappers tools that are usually aroud 10 lines of shell, but testers ain’t on linux so they cannot use them.

I don’t know perl, but each time I interract with pyton’s projects I have a different venv/poetry/… to setup. Forget adout it the next time and nothing is kept easy to reuse.

bhamlin@lemmy.world on 07 Nov 13:59 collapse

Perl isn’t really any better. There aren’t easy tools that do the same thing as venv. They exist, but they are not easy. Plus there are a much larger amount of cpan modules that have c in them than python.

WolfLink@sh.itjust.works on 07 Nov 03:05 next collapse

The reason you do stuff in a venv is to isolate that environment from other python projects on your system, so one Python project doesn’t break another. I use Docker for similar reasons for a lot of non-Python projects.

A lot of Python projects involve specific versions of libraries, because things break. I’ve had similar issues with non-Python projects. I’m not sure I’d say Python is particularly worse about it.

There are tools in place that can make the sharing of Python projects incredibly easy and portable and consistent, but I only ever see the best maintained projects using them unfortunately.

Jocker@sh.itjust.works on 07 Nov 04:47 next collapse

I’ve started using poetry and the experience has improved.

Die4Ever@programming.dev on 07 Nov 05:12 next collapse

I’m not sure this can be really fixed with Python 3, maybe we just have to hope for Python 4

flying_sheep@lemmy.ml on 07 Nov 07:24 collapse

It’s fixed, and the python version had nothing to do with it. Just use hatch

Tja@programming.dev on 07 Nov 07:33 collapse

Ah yes, the 15th standard we’ve been waiting for!

flying_sheep@lemmy.ml on 07 Nov 07:41 collapse

It’s not a standard, it’s built on standards.

You can also use Poetry (which recently grew standard metadata support) or plain uv venv if you want to do things manually but fast.

Zykino@programming.dev on 07 Nov 20:10 collapse

Just use this one… or any of these 4 others.

This is the issue for us python outsiders. Each time we try, we get a different answer with new tools. We are outside of the community; we don't know the trends, old and new, the pros and cons.

Your first recommendation is hatch… first time I've heard of it. uv seems trendy in this thread, but before that it was unknown to me too.

As I understand it, this should be pip's job. When it detects I'm in a project, it should install packages into it and Python should use them. It can use any tool under the hood, but the default package manager should be able to do it on its own.

flying_sheep@lemmy.ml on 07 Nov 23:02 collapse

Uv and pip do the same thing, uv is just faster.
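
E.g. the rough drop-in equivalents:

# classic pip
python -m venv .venv && source .venv/bin/activate
pip install -r requirements.txt
# same thing through uv
uv venv && source .venv/bin/activate
uv pip install -r requirements.txt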

Hatch has the same role as Poetry or tox: managing environments for you.

Applications should be packaged properly, in a self contained installer for exactly this demographic. It’s not Python’s fault that this isn’t common practice.

vin@lemmynsfw.com on 07 Nov 06:11 next collapse

Yep, they are not portable, every app should come bundled with its own interpreter. As to why, I think historically it didn’t target production grade application development.

moonpiedumplings@programming.dev on 07 Nov 06:37 next collapse

How to improve Python packaging, or why fourteen tools are at least twelve too many

Still relevant.

flying_sheep@lemmy.ml on 07 Nov 07:23 next collapse

No it’s not. E.g. nobody who starts a new project uses setup.py anymore

moonpiedumplings@programming.dev on 07 Nov 08:30 next collapse

OP seems to be trying to install older projects, rather than creating a new project.

Kissaki@programming.dev on 07 Nov 17:52 collapse

Are you sure? I’m not very active in that ecosystem, but if that was prevalent in the past, surely there’s still tutorials and stuff out there that people would follow and create such projects even today?

More than that, it seems to me that the official python docs for packaging [still] talks about setup.py. Why would people not use that?

flying_sheep@lemmy.ml on 07 Nov 22:59 collapse

Sure, there was some hyperbole. Some people need some specific setuptools plugin or something. Almost nobody.

snowe@programming.dev on 08 Nov 20:39 collapse

When the official docs are telling you to use it, then it's used. You can't expect people to think the tooling isn't shit when it's literally the official recommendation.

flying_sheep@lemmy.ml on 09 Nov 08:31 collapse

It doesn’t. read the first words behind the link you posted:

Page Status: Outdated

Here is the actual one: packaging.python.org/en/…/packaging-projects/

JackbyDev@programming.dev on 07 Nov 15:59 collapse

<img alt="python" src="https://imgs.xkcd.com/comics/python_environment.png">

xkcd.com/1987/

vext01@lemmy.sdf.org on 07 Nov 07:30 next collapse

The venv stuff is pretty annoying, I agree.

Lettuceeatlettuce@lemmy.ml on 07 Nov 09:36 collapse

As a baby Python Dev, I’m glad it’s not just me.

flubba86@lemmy.world on 07 Nov 12:37 collapse

I’ve been full time writing python professionally since 2015. You get used to it. It starts to just make sense and feel normal. Then when you move to a different language environment you wonder why their tooling doesn’t use a virtualenv.

Lettuceeatlettuce@lemmy.ml on 07 Nov 22:15 collapse

I’m starting to get the hang of it. I was using Debian, so I had to figure out the basics of venv because many of the frameworks I was trying to learn require newer versions of Python than what comes with Debian.

vscodium works really easily inside it though, so it wasn’t too bad, but I still feel like I’m treading water a little bit.

FizzyOrange@programming.dev on 07 Nov 07:54 next collapse

Yes it’s terrible. The only hope on the horizon is uv. It’s significantly better than all the other tooling (Poetry, pip, pipenv, etc.) so I think it has a good chance of reducing the options to just Pip or uv at least.

But I fully expect the Python Devs to ignore it, and maybe even make life deliberately difficult for it like they did for static analysers. They have some strange priorities sometimes.

flubba86@lemmy.world on 07 Nov 12:34 next collapse

I like the idea of uv, but I hate the name. Libuv is already a very popular C library, and used in everything from NodeJS to Julia to Python (through the popular uvloop module). Every time I see someone mention uv I get confused and think they’re talking about uvloop until I remember the Astral project, and then reconfirm to myself how much I disapprove of their name choice.

FizzyOrange@programming.dev on 07 Nov 22:28 collapse

I don’t think libuv is really that popular, nor is it that confusing.

But I do agree it’s not a very good name. “Rye” is a much better name. Probably too late anyway.

scrawdaddy@lemmy.world on 07 Nov 18:55 next collapse

UV is a game changer for python.

I hated the tooling until I found it.

tempest@lemmy.ca on 08 Nov 00:38 collapse

uv is good but it needs a little more time in the oven.

For the moment I would definitely recommend poetry if you are not a library developer. Poetry's biggest sin is its atrocious performance, but it has most of the features you need to work with Python apps today.

FizzyOrange@programming.dev on 08 Nov 07:15 collapse

Why do you say it needs more time in the oven? I’ve had zero issues with it as a drop-in replacement for Pip in a large commercial project, which is an extremely impressive achievement. (And it was 10x faster.)

I tried Poetry once and it failed to resolve dependencies on the first thing I tried it on. If anything Poetry needs more time in the oven. It also wasn’t 10x faster.

N0x0n@lemmy.ml on 07 Nov 11:55 next collapse

Just out of curiosity, I haven’t seen anyone recommend miniconda… Why so, is there something wrong I’m not aware of?

I’m no expert, but I totally feel you, python packages, dependencies and version matching is a real nightmare. Even with venv I had a hard time to make everything work flawlessly, especially on MacOS.

However, with miniconda everything was way easier to configure and worked as expected.

JackbyDev@programming.dev on 07 Nov 16:00 collapse

Isn’t conda specifically for mathy things?

N0x0n@lemmy.ml on 14 Nov 09:57 collapse

I haven’t heard of Mathy, but it seems to be a math tool?

From what I gathered, miniconda is like pipx or venv. It’s able to create python virtual environments.

But I’m very new to all of this so I’m not really a good source. However after experimenting with either of them (venv, pip or miniconda) I found miniconda the easiest to use, but that’s also probably a skill issue.

I was genuinely asking because their could be something I wasn’t aware of because yeah I’m new to all of this. (proprietary, bugs, not the right tool…

You seem related to programming, maybe you could give me some pointers here?

JackbyDev@programming.dev on 14 Nov 13:49 collapse

By mathy I mean related to math

nucleative@lemmy.world on 07 Nov 11:59 next collapse

Python developer here. Venv is good, venv is life. Every single project I create starts with

python3 -m venv venv

source venv/bin/activate

pip3 install {everything I need}

pip3 freeze > requirements.txt

Now write code!

Don’t forget to update your requirements.txt using pip3 freeze again anytime you add a new library with pip.

If you installed a lot of packages before starting to develop with virtual environments, some libraries will be in your OS python install and won’t be reflected in pip freeze and won’t get into your venv. This is the root of all evil. First of all, don’t do that. Second, you can force libraries to install into your venv despite them also being in your system by installing like so:

pip3 install --ignore-installed mypackage

If you don’t change between Linux and windows most libraries will just work between systems, but if you have problems on another system, just recreate the whole venv structure

rm -rf venv (…make a new venv, activate it) pip3 install -r requirements.txt

Once you get the hang of this you can make Python behave without a lot of hassle.

This is a case where a strength can also be a weakness.

azthec@feddit.nl on 07 Nov 15:22 next collapse

This is the way

Derp@lemmy.ml on 07 Nov 17:54 next collapse

This is the way.

GetOffMyLan@programming.dev on 11 Nov 11:38 collapse

It’s a stupid way

JackbyDev@programming.dev on 07 Nov 15:56 next collapse

Okay, now give me those steps but what to do if I clone an already existing repo please

megaman@discuss.tchncs.de on 07 Nov 16:25 collapse

The git repo should ignore the venv folder, so when you clone you then create a new one and activate it with those steps.

Then when you are installing requirements with pip, the repo you cloned will likely have a requirements.txt file in it, so you ‘pip install -r requirements.txt’
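
So, something like this (repo URL is a placeholder):

git clone https://example.com/some/repo.git
cd repo
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt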

NostraDavid@programming.dev on 07 Nov 18:09 next collapse

pip3 freeze > requirements.txt

I hate this. Because now I have a list of your dependencies, but also the dependencies of the dependencies, and I now have regular dependencies and dev-dependencies mixed up. If I’m new to Python I would have NO idea which libraries would be the important ones because it’s a jumbled mess.

I’ve come to love uv (coming from poetry, coming from pip with a requirements/base.txt and requirements/dev.txt - gotta keep regular dependencies and dev-dependencies separate).

uv sync

uv run <application>

That’s it. I don’t even need to install a compatible Python version, as uv takes care of that for me. It’ll automatically create a local .venv/, and it’s blazingly fast.

nucleative@lemmy.world on 07 Nov 23:27 collapse

I’ve never really spent much time with uv, I’ll give it a try. It seems like it takes a few steps out of the process and some guesswork too.

oldfart@lemm.ee on 07 Nov 19:18 next collapse

OP sounds like a victim of Python 3, finding various Python 2 projects on the internet, a venv isn’t going to help

tyler@programming.dev on 08 Nov 20:37 collapse

You have been in lala land for too long. That list of things to do is insane. venv is possibly one of the worst solutions around, but many Python devs are incapable of seeing how bad it is. Just for comparison, so you can understand: in Ruby literally everything you did is covered by one command, bundle. On every system.

pixelscript@lemm.ee on 07 Nov 15:47 next collapse

Python is the only programming language that has forced me to question what the difference is between an egg and a wheel.

JackbyDev@programming.dev on 07 Nov 15:54 collapse

No, it’s not just you, Python’s tooling is a mess. It’s not necessarily anyone’s fault, but there are a ton of options and a lot of very similarly named things that accomplish different (but sometimes similar) tasks. (pyenv, venv, and virtualenv come to mind.) As someone who considers themselves between beginner and intermediate proficiency in Python, this is my biggest hurdle right now.

NostraDavid@programming.dev on 07 Nov 17:58 collapse

Python’s tooling is a mess.

Not only that. It’s a historic mess. Over the years, growing a better and better toolset left a lot of projects in a very messy state. So many answers on Stack Overflow that mention easy_install - I still don’t know what it is, but I guess it was some kind of proto uv.

JackbyDev@programming.dev on 07 Nov 19:08 collapse

Every time I’m doing anything with Python I ask myself if Java’s tooling is this complicated or I’m just used to it by now. I think a big part of the weirdness is that a lot of Python tooling is tied to the Python installation whereas in Java things like Maven and Gradle are separate. In addition, I think dependencies you install get tied to that Python installation, while in Java they just are in a cache for Maven/Gradle. And in the horrible scenario where you need to use different versions of Maven/Gradle (one place I was at specifically needed Maven 3.0.3 for one project and a different for a different, don’t ask, it’s dumb and their own fault for setting it up that way) at least they still have one common cache for everything.

I guess it also helps that with Java you (often) don’t need platform specific jar files. But Python is often used as an easy and dynamic scripting interface over more performant, native code. So you don’t really run into things like “this artifact doesn’t have a 64 bit arm version for python 2” often with Java. But that’s not a fault of Python’s tooling, it’s just the reality of how it’s used.