What is the real world benefit we will get in return?
In the rare case where I need to max out more than one CPU core, I usually implement that by having the OS run multiple instances of my program and put a bit of parallelization logic into the program itself. Like in the mandelbrot example the author gives, I would simply tell each instance of the program which part of the image it will calculate.
I wonder if that’s something they could automate? I’m sure there are some weird risks with that. Maybe a small program ends up eating all your memory in some edge case?
I spent all day not knowing whether "up the hill" meant they shipped or didn't ship. So they shipped, right? Or they shipped a JIT but removed the GIL?
The JIT does not seem to help much. All in all a very disappointing release that may be a reflection of the social and corporate issues in CPython.
A couple of people have discovered that they can milk CPython by promising features, silencing those who are not 100% enthusiastic, and then underdelivering. Marketing takes care of the rest.
https://discuss.python.org/t/incremental-gc-and-pushing-back...
(Write single threaded code and have a compiler create multithreaded code)
https://en.m.wikipedia.org/wiki/Automatic_parallelization_to...
> What happens if multiple threads try to access / edit the same object at the same time? Imagine one thread is trying to add to a dict while another is trying to read from it. There are two options here
Why not just ignore this, like C and C++ do? Worst case it's a data race; best case the programmer either adds a lock or writes a thread-safe dict themselves. What am I missing here?
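For what it's worth, the "programmer puts the lock" option looks roughly like this (a sketch with made-up names, not the CPython internals under discussion): a plain dict guarded by a `threading.Lock` on both the read and write paths.

```python
import threading

shared = {}                      # a plain dict, not thread-safe by contract
lock = threading.Lock()

def writer(n):
    for i in range(n):
        with lock:               # serialize writes
            shared[i] = i * i

def reader(n):
    hits = 0
    for i in range(n):
        with lock:               # serialize reads against concurrent writes
            hits += i in shared
    return hits

t = threading.Thread(target=writer, args=(1000,))
t.start()
reader(1000)
t.join()
print(len(shared))
```

Dropping the `with lock:` blocks is exactly the C/C++-style "ignore it" option: under a free-threaded interpreter that choice has to be made somewhere, either by the runtime or by the programmer.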
You’d think certain patterns could be provably safe, and the interpreter could take the initiative.
Is there a term for this concept?
Yeah right..
There’s probably a whole generation of programmers (if not two) who don’t know the feeling of shooting yourself in the foot with multithreading. You spend a month on a prototype, then some more to hack it all together for semi-real-world situations, polish the edges, etc. And then it falls flat on day 1 due to unexpected races. Not a bad thing in itself; transferable experience is always valuable. And don’t worry, this one is. There are enough ecosystems where it’s not “difficult to share data”.
Python is absolutely the worst language to work in with respect to code formatters. In any other language I can write my code, pressing enter or skipping it however I want, and the auto-formatter just fixes it and makes it look like normal code. But in Python, one forgotten space or one extra space and it just gives up.
It wouldn't even take much: just add an "end" keyword and the LSPs could take care of the rest.
GIL and JIT are nice, but please give me end.
Naturally I can easily compile my own Python 3.13 version, no biggie.
However, in my experience, this makes many people who could potentially try it out and give feedback not bother and just wait instead.