Ranking programming languages by energy efficiency (2021)
(www.sciencedirect.com)
from pylapp@programming.dev to programming@programming.dev on 29 Jul 09:19
https://programming.dev/post/34722777
A bit old but still interesting
threaded - newest
tldr:
opinionated tldr:
Their conclusion in section 3.1:
That said, they really seem to be pointing out that there are exceptions to this rule, where the languages that solve a problem most quickly are not the ones that use the least energy doing it. If you step back from the detail of individual algorithms, speed and energy efficiency do seem broadly to vary together, as one would expect.
If your target audience says "too lazy, didn't read", I think the bit that applies like a rule of thumb to most cases is more relevant and has higher practical value than the intricate details or an "it depends".
(Similar to how you can explain gravity with Newton instead of Einstein to keep it short, even though it is less precise or technically false.)
That was a fascinating discovery. It seems Pascal and Fortran in particular fit into the “faster but less efficient energy-wise” category. I wonder what’s going on there.
What they do have in common is that they are both O.G. languages.
Java has been around a really long time and I was still surprised how well it did.
I am shocked Fortran didn't do better. I don't code in Fortran, but I assumed languages closer to the machine would do well.
Rosetta Code Global Ranking (position: language)
1 C
2 Pascal
3 Ada
4 Rust
5 C++, Fortran
6 Chapel
7 OCaml, Go
8 Lisp
9 Haskell, JavaScript
10 Java
11 PHP
12 Lua, Ruby
13 Perl
14 Dart, Racket, Erlang
15 Python
When you add a dot after the number it becomes a numbered list, and you don't have to use a paragraph for each line.
Alternatively, you can use a backslash (\) or two spaces at the end of a line to force a line break, so you can put one line after the other instead of requiring paragraphs.
Ah, thanks, my formatting skills are quite limited.
Honestly surprised C# isn't on here? It's still one of the "big 5" languages, and .NET touts its incredible performance on the regular.
Here’s their conclusion ranking.
[image: the paper's conclusion ranking table]
Python should be even further down - this list doesn’t account for the fact that you have to rewrite everything because “That’s not pythonic”.
Perl should be higher up because it lets you just do things with some weird smileys, without jumping through hoops.
That's not really their conclusion; that's the Rosetta Code ranking. Table 4 is the one that summarizes the results of many tests and includes a lot more languages.
Normalized global results for Energy, Time, and Memory.
I find this paper false/misleading. They just translated one algorithm into many languages, without using each language's constructs or specific features to make the algorithm reasonably performant.
Also it doesn’t mean anything, as you aren’t just running your code. You are compiling/transpiling it, testing it, deploying it… and all those operations consume even more energy.
I'd argue that C/C++ projects use the most energy in terms of testing, due to the quantity of bugs they can present and the amount of CPU time needed just to compile your 10-20k-line program. Just my 2 cents.
The amount of CPU time spent compiling code is usually negligible compared to CPU time at runtime. Your comparison only really works if you are comparing against something like Rust, where fewer bugs are introduced due to certain guarantees by the language.
Regarding "language constructs", it really depends on what you mean. For example, using numpy in Python is kind of cheating, because numpy is implemented in C. However, using something like the algorithm libraries in Rust would be considered fair game, since they are likely written in Rust itself.
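To make the "implemented in C" point concrete, here's a small sketch using only the standard library (no numpy needed): CPython's built-in sum() is itself written in C, so it plays the same role numpy plays in the comment above, and it easily outpaces an equivalent Python-level loop.

```python
import timeit

data = list(range(1_000_000))

def py_sum(xs):
    # Pure-Python loop: every iteration executes interpreter bytecode.
    total = 0
    for x in xs:
        total += x
    return total

# Both compute the same result...
assert py_sum(data) == sum(data)

# ...but the C-implemented builtin is typically an order of magnitude faster.
t_loop = timeit.timeit(lambda: py_sum(data), number=10)
t_builtin = timeit.timeit(lambda: sum(data), number=10)
print(f"python loop: {t_loop:.3f}s, builtin sum(): {t_builtin:.3f}s")
```

Whether a benchmark should count that kind of delegation as "Python" is exactly the judgment call being debated here.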
They presented their methodology in an open and clear way and provide their data for everyone to interpret. You can disagree with conclusions but it’s pretty harsh to say it’s “misleading” simply because you don’t like the results.
They used two datasets, if you read the paper… It wasn’t “one algorithm” it was several from publicly available implementations of those algorithms. They chose an “optimized” set of algorithms from “The Computer Language Benchmarks Game” to produce results for well-optimized code in each language. They then used implementations of various algorithms from Rosetta Code which contained more… typical implementations that don’t have a heavy focus on performance.
In fact - using “typical language constructs or specificities” hurt the Java language implementations since List is slower than using arrays. It performed much better (surprisingly well actually) in the optimized tests than in the Rosetta Code tests.
Honestly that’s all you need to know to throw this paper away.
Why?
It’s a very heavily gamed benchmark. The most frequent issues I’ve seen are:
They've finally started labelling stupid submissions with "contentious" labels at least, but that wasn't the case when this study was done.
They provide the specific implementations used here: github.com/greensoftwarelab/Energy-Languages
I dislike the “I thought of something that may be an issue therefore just dismiss all of the work without thinking” approach.
I agree, but if you take away the hard numbers from this (which you should) all you’re left with is what we all already knew from experience: fast languages are more energy efficient, C, Rust, Go, Java etc. are fast; Python, Ruby etc. are super slow.
It doesn’t add anything at all.
Well… No. You’re reading the title. Read the document.
“We all know” is the gateway to ignorance. You need to test common knowledge to see if it’s really true. Just assuming it is isn’t knowledge, it’s guessing.
Second, it's not always true:
Third, they also tested memory usage to see if it was involved in energy usage.
Just build more solar
Does anyone understand the Pearson coefficient part enough to explain it? I don’t really understand why they’re measuring correlation between memory, energy, and time in that way/ how you’d interpret it.
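Not a full read of the paper's statistics, but the general idea: the Pearson coefficient r measures how well two variables track each other linearly, from -1 (perfect inverse relation) through 0 (no linear relation) to +1 (perfect linear relation). Computing r between, say, time and energy across languages answers "does taking more time predict using more energy?". A self-contained sketch with made-up illustrative numbers (not the paper's data):

```python
import math

def pearson(xs, ys):
    # r = cov(X, Y) / (std(X) * std(Y)); always in [-1, +1].
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-language measurements, roughly energy ~ 20 * time:
time_s   = [1.0, 1.3, 2.0, 5.1, 9.8]
energy_j = [20.0, 27.0, 39.0, 100.0, 195.0]

r = pearson(time_s, energy_j)
print(f"r = {r:.3f}")  # close to 1.0: energy tracks time almost linearly
```

An r near 1 for time vs. energy supports the "faster means greener" rule of thumb; a noticeably weaker r for memory vs. energy would suggest peak memory use is a poor predictor of energy draw, which is roughly the question the paper's correlation tables are probing.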
Stop linking this, please! Any benchmark where Typescript and JavaScript are different is trash.
Why? TypeScript transpiles into JS, but maybe the transpiler introduces some overhead; isn't that possible? I am not familiar with these languages, so this is an honest question.
It’s barely transpiled. There are a couple of features that involve actual code generation - enums and namespaces (which are almost never used), but the vast majority of it is just stripping the type annotations so the performance will be 100% identical.
It’s like having “Python” and “Python with type hints” as separate languages and claiming there is a big speed difference between them.
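That Python analogy can actually be verified in any recent CPython: type hints are stored as metadata on the function object, and the compiled bytecode of a hinted and an unhinted function is byte-for-byte identical.

```python
def add_untyped(a, b):
    return a + b

def add_typed(a: int, b: int) -> int:
    return a + b

# The hints live in __annotations__, a plain metadata dict...
print(add_typed.__annotations__)

# ...while the compiled bytecode is byte-for-byte identical,
# so the runtime cost is identical too.
print(add_untyped.__code__.co_code == add_typed.__code__.co_code)  # True
```

The TypeScript situation is analogous: tsc erases the type annotations, so plain typed code emits the same JavaScript you would have written by hand, with the enum/namespace cases noted above being the rare exceptions that generate extra code.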
I was originally going to comment that differences in performance between JS and TS appear to be most significant in one challenge
A table about that (each cell appears to be Energy : Time : Memory, per the paper's tables; the original ASCII layout was not screen-reader friendly):

| lang | binary-trees | fannkuch-redux | fasta | normalized |
|------|--------------|----------------|-------|------------|
| JS | 312.14 : 21.349 : 916 | 413.90 : 33.663 : 26 | 64.84 : 5.098 : 30 | 4.45 : 6.52 : 4.59 |
| TS | 315.10 : 21.686 : 915 | 6898.48 : 516.541 : 26 | 82.72 : 6.909 : 271 | 21.50 : 46.20 : 4.69 |
| diff | +0.9% : +1.6% : +0.1% | +1566.7% : +1434.4% : 0% | +21.6% : +35.5% : +803.3% | +383.1% : +608.5% : +8.1% |
however, trying to find the TypeScript code in order to translate it to type-hinted Python, I've been unable to find TypeScript on CLBG, so I'm confused about how they got that data.
They used to have Typescript. Looks like they removed it at some point.
Very impressive study. I was surprised to learn that Ruby was categorized as functional but not object-oriented. I thought that was their whole shtick.