mrweasel
My issue with this is that it will eventually sneak into libraries, and the users of those libraries will be expected to use these tag strings all over the place. This prevents people from having a uniform coding style and makes code harder to read.

The concern isn't having features that will make it easier to write DSLs; my problem is that people will misuse it in regular Python projects.

I know that one of the authors is Guido, but I'm not buying the motivation. Jinja2 and Django templates are pretty much just using Python; it's not really much of an issue, and I don't believe that business logic should exist in your templates anyway. As for the SQL argument, it will still be possible for people to mess it up, even with tag strings, unless you completely remove all legacy code. The issue here isn't that the existing facilities aren't good enough; it's that many developers aren't aware of the concepts, like prepared statements. If developers aren't reading the docs to learn about prepared statements, why would they do so for some DSL built on tag strings?
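(For reference, the existing facility in question: a parameterized query via the stdlib's sqlite3 module. One driver among many; placeholder syntax varies by driver.)

```python
import sqlite3

# The value travels separately from the SQL text, so a hostile string
# cannot change the structure of the query.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
hostile = "x'; DROP TABLE users; --"
conn.execute("INSERT INTO users (name) VALUES (?)", (hostile,))
row = conn.execute("SELECT name FROM users WHERE name = ?", (hostile,)).fetchone()
assert row[0] == hostile  # stored and retrieved verbatim; the table survives
```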

Obviously Guido is a better developer than I am, so I might be completely wrong, but this doesn't feel right. I've seen tools developed to avoid doing the proper training, and the result is always worse.

dmart
I have to admit that at first glance I don’t like this. These seem to be essentially normal str -> Any functions, with some naming limitations due to the existing string prefixes special-cased in the language. I don’t feel like adding this additional complexity is worth being able to save two parentheses per function call.
ianbicking
I LOVE tagged templates in JavaScript.

But in Python I could also imagine YET ANOTHER constant prefix, like t"", that returns a "template" object of some sort, and then you could do html(t"") or whatever. That is, it would be just like f"" but return an object and not a string so you could get at the underlying values. Like in JavaScript the ability to see the original backslash escaping would be nice, and as an improvement over JavaScript the ability to see the expressions as text would also be nice.

But the deferred evaluation seems iffy to me. Like I can see all the cool things one might do with it, but it also means you can't understand the evaluation without understanding the tag implementation.

Also I don't think deferred evaluation is enough to make this an opportunity for a "full" DSL. Something like a for loop requires introducing new variables local to the template/language, and that's really beyond what this should be, or what deferred evaluation would allow.

TwentyPosts
This seems like a bad idea at first glance? Maybe I don't get the whole pitch here?

It just doesn't seem worth it to define a whole new thing just to abstract over a format() function call. The laziness might be interesting, but I feel like "lazy strings" might be all that's needed here. Laziness and validation (or custom string formatting logic) are separate concerns and should be separated.

DataDive
Excellent idea, I don't get the criticism.

If a syntax such as f"{variable}" is already a feature - and turned out to be a popular one - why shouldn't we be able to add our own custom "f"s? Because that is what this is about. It might make generating output even simpler.

I applaud the idea and am pleased to see that Python keeps innovating!

Hamuko
I hate the idea of reusing the existing string/bytes prefixes for something that is completely different. How is someone expected to know that br"" is built-in Python syntax and my"" is essentially a user-defined function? And the only way to ever add a new prefix to the language (like f"" was added, fairly recently) is to wait until Python 4, at which point we'll need 3to4 to automatically rename all of your old tag strings that now conflict, and people will bitch about how badly major Python upgrades suck.
ziml77
It seems the purpose of this proposal is to have a way to essentially have custom string interpolation. I don't think that's necessarily a bad idea on its own, but this syntax feels out of place to me.

Instead, why not add a single new string prefix, like "l" for "lazy"? So, f"hello {name}" would immediately format it while l"hello {name}" would produce an object which contains a template and the captured variables. Then their example would be called like: greet(l"hello {name}").
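(A rough sketch of what such an l"" prefix might desugar to. The names `LazyStr` and `render` are made up for illustration; no such prefix exists in Python.)

```python
class LazyStr:
    """What l"hello {name}" might produce: template text plus captured values."""

    def __init__(self, template, values):
        self.template = template  # the raw, unformatted template
        self.values = values      # variables captured at creation time

    def render(self):
        return self.template.format(**self.values)

def greet(lazy):
    # The callee decides when and how to interpolate.
    return lazy.render().upper() + "!"

name = "world"
msg = LazyStr("hello {name}", {"name": name})  # stand-in for l"hello {name}"
assert greet(msg) == "HELLO WORLD!"
```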

treyd
I can't help but believe that this is introducing more spooky action at a distance and is bound to be abused. Is it really more usable this way? Do they have any concrete and practical examples where this improves readability?
tofflos
I would have loved to see Java introduce something similar to IntelliJ's @Language annotation in the standard library, but maybe they'll figure out the sweet spot in a future string templating JEP.

  @Language("application/sql")
  String query = "SELECT 1";

  @Language("application/graphql+json")
  String graphqlQuery = """
      query HeroNameAndFriends {
        hero {
          name
          friends {
            name
          }
        }
      }
      """;
formerly_proven
Yikes. Don't get me wrong, I totally understand the reasoning why this would be useful (though I violently disagree with the idea of deferring the evaluation of the contained expressions), but it's also so very kitchensinky and adds so little over just calling a function (which doesn't require a 20-page explainer, as everyone already knows how function calls work). It also promotes using what looks like string interpolation (and what might be string interpolation, you can't tell at the "call site") for things which we know string interpolation is the wrong tool. The API also seems really, I dunno, weird to me. The string is split around interpolations and verbatim portions result in one argument, which is "string-like", while interpolations become four-tuple-like (one of which is a lambda, which you call to perform the deferred interpolation). This seems really awkward to me for building stuff like the suggested use cases of XML/HTML or SQL templating.

Also the scoping rules of this are a special case which doesn't appear in regular Python code so far: "The use of annotation scope means it’s not possible to fully desugar interpolations into Python code. Instead it’s as if one is writing interpolation_lambda: tag, not lambda: tag, where a hypothetical interpolation_lambda keyword variant uses annotation scope instead of the standard function scope." -- i.e. it's "as if you wrapped all interpolation expressions in a lambda: <expr>, except it uses different scoping rules".

cr125rider
This seems very unpythonic in the way that it breaks the one best way to do things adage. It’s syntax sugar for a function call. Just keep it a function call if needed.
Spivak
The much bigger feature here is buried under the DSL stuff. Python is effectively implementing a method of lazy evaluation of function arguments! I thought I would never see the day! It's crazy that if this PEP is accepted, functions in Python will actually be a special case of f-strings.

I hope they eventually grant this power to regular functions, because otherwise I know folks will end up writing myfunc"{arg1},{arg2}" just to get that feature.
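(A minimal illustration of the status quo this comment alludes to: today, lazy argument evaluation has to be spelled out with explicit lambdas.)

```python
# Track how many times the expensive computation actually runs.
calls = []

def expensive():
    calls.append(1)
    return 42

def get_or_compute(cache, key, compute):
    # `compute` is only invoked on a cache miss.
    if key not in cache:
        cache[key] = compute()
    return cache[key]

cache = {}
assert get_or_compute(cache, "x", lambda: expensive()) == 42
assert get_or_compute(cache, "x", lambda: expensive()) == 42
assert len(calls) == 1  # the second lambda was never called
```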

nope96
Does this mean I can write a print function and be able to print"hello world" without parentheses again, like in Python 2.x?
pansa2
Off-topic, but when did Python become so... verbose? From the PEP:

    def greet(*args: Decoded | Interpolation) -> str:
        result = []
        for arg in args:
            match arg:
                case Decoded() as decoded:
                    result.append(decoded)
                case Interpolation() as interpolation:
                    value = interpolation.getvalue()
                    result.append(value.upper())

        return f"{''.join(result)}!"
Isn't that just this?

    def greet(*args):
        def convert(arg):
            return arg.getvalue().upper() if hasattr(arg, "getvalue") else arg

        return ''.join(convert(arg) for arg in args) + '!'
orbisvicis
This seems similar to my protostrings library [1] which I wrote years ago and mostly forgot about till now.

1. https://protostrings.readthedocs.io/en/latest/index.xhtml

iirc I wanted to encode state in a recursive-descent parser without additional complexity to the parser.

Similar in purpose, not design: protostrings provides lazy and context-sensitive strings from the bottom up, rather than this top-down, template-style approach, which I feel addresses most of the concerns here.

spankalee
I tried skimming the PEP while I could, but it seems like this might be missing a couple of the features that make JS tagged template literals work so well:

- tags get a strings array that's referentially stable across invocations. This can function as a cache key to cache string-parsing work.

- tags can return any kind of value, not just a string. Often you need to give structured data to another cooperating API.

Deferred evaluation of expressions is very cool, and would be really useful for reactive use-cases, assuming they can be evaluated multiple times.

jtwaleson
For people adding insightful critique on the PEP on HN (I saw some on this thread already), please ensure your opinion is represented in the PEP thread itself too.
samatman
I think this will turn out well. Julia has had this forever as string macros, and it has worked out rather nicely, features like `r"\d+"` for regex, and `raw"strings"` are just string macros. The set of all useful custom literal strings isn't bounded, so a lightweight mechanism to define them and make use of the results is a good thing.

Another kitchen sink to add to Python's world-class kitchen sink collection.

kbd
At least in the spirit of "the language shouldn't be able to define things the user can't" (see: Java string concatenation) this seems like a good change.
Too
Looks good. Would have been nice if they included a way to express type checking of the format_spec. That’s going to be an unnecessary source of runtime errors.
zoogeny
I've seen this feature used responsibly and to good effect in a few TypeScript projects so I understand why it would be desirable in Python.
agumonkey
Seems like many languages are allowing compile time interception (zig, es, now python)
Groxx
... is this any different than a function like this:

    greet("hello {world}")
which walks the call stack to find the variables, and uses them as locals / arguments?

If so: why not just do that? I would expect the performance to be kinda terrible compared to a language-intrinsic, but this hardly seems like a thing worth truly optimizing. And if it's too costly at runtime, someone can implement a parse-time AST rewriter, like many Python things already do. Heck, that example's `assert` is probably using pytest (because everyone uses pytest) and it's doing exactly this already, and it isn't a language feature, it's just a normal library using normal Python features.
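(A sketch of the frame-walking approach described above. CPython-specific and fragile, and opaque to type checkers; `greet` and `demo` are made up for illustration.)

```python
import inspect

def greet(template):
    # Reach back into the caller's stack frame and format with its locals.
    caller = inspect.currentframe().f_back
    return template.format(**caller.f_locals)

def demo():
    world = "there"
    return greet("hello {world}")

assert demo() == "hello there"
```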

behnamoh
I want this in Python: https://codecodeship.com/blog/2024-06-03-curl_req

From the article:

    ~CURL[curl https://catfact.ninja/fact]
    |> Req.request!()

"This is actual code; you can run this. It will convert the curl command into a Req request and you will get a response back. This is really great, because we have been able to increase the expressiveness of the language."