But in Python I could also imagine YET ANOTHER constant prefix, like t"", that returns a "template" object of some sort, and then you could do html(t"") or whatever. That is, it would be just like f"" but return an object and not a string so you could get at the underlying values. Like in JavaScript the ability to see the original backslash escaping would be nice, and as an improvement over JavaScript the ability to see the expressions as text would also be nice.
But the deferred evaluation seems iffy to me. Like I can see all the cool things one might do with it, but it also means you can't understand the evaluation without understanding the tag implementation.
Also I don't think deferred evaluation is enough to make this an opportunity for a "full" DSL. Something like a for loop requires introducing new variables local to the template/language, and that's really beyond what this should be, or what deferred evaluation would allow.
It just doesn't seem worth it to define a whole new thing just to abstract over a format() function call. The laziness might be interesting, but I feel like "lazy strings" might be all that's needed here. Laziness and validation (or custom string formatting logic) are separate concerns and should be separated.
If a syntax such as f"{variable}" is already a feature - and turned out to be a popular one - why shouldn't we be able to add our own custom "f"s? Because that is what this is about. It might make generating output even simpler.
I applaud the idea and am pleased to see that Python keeps innovating!
Instead, why not add a single new string prefix, like "l" for "lazy"? So, f"hello {name}" would immediately format it while l"hello {name}" would produce an object which contains a template and the captured variables. Then their example would be called like: greet(l"hello {name}").
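A minimal sketch of what such a lazy-template object could look like today, without language support. The `LazyTemplate` class and `l()` helper are hypothetical names invented for illustration; since Python doesn't allow user-defined string prefixes, the capture of the caller's variables is approximated by frame inspection rather than done at compile time:

```python
import inspect
import string

class LazyTemplate:
    """Hypothetical stand-in for an l"" literal: holds the template text
    and the variables captured at creation time; formats only on demand."""
    def __init__(self, template, captured):
        self.template = template
        self.captured = captured

    def render(self):
        # vformat looks up {name} fields in the captured mapping
        return string.Formatter().vformat(self.template, (), self.captured)

def l(template):
    # A real l"" prefix would capture variables at the literal's location;
    # here we approximate by snapshotting the caller's locals.
    caller = inspect.currentframe().f_back
    return LazyTemplate(template, dict(caller.f_locals))

def greet(tmpl):
    return tmpl.render() + "!"

def demo():
    name = "World"
    msg = l("hello {name}")   # nothing formatted yet
    name = "Mars"             # rebinding doesn't affect the snapshot
    return greet(msg)

print(demo())  # -> hello World!
```

This is only a sketch of the semantics the comment describes; a real prefix would avoid the frame-walking cost and fragility.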
@Language("application/sql")
String query = "SELECT 1";

@Language("application/graphql+json")
String query = """
    query HeroNameAndFriends {
      hero {
        name
        friends {
          name
        }
      }
    }
    """;
Also the scoping rules of this are a special case which doesn't appear in regular Python code so far: "The use of annotation scope means it’s not possible to fully desugar interpolations into Python code. Instead it’s as if one is writing interpolation_lambda: tag, not lambda: tag, where a hypothetical interpolation_lambda keyword variant uses annotation scope instead of the standard function scope." -- i.e. it's "as if you wrapped all interpolation expressions in a lambda: <expr>, except it uses different scoping rules".
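To make the quoted passage concrete, here is a hand-rolled analogue of that desugaring, with a plain `lambda` standing in for the hypothetical `interpolation_lambda` (so it uses ordinary function scope, which is exactly the difference the passage is pointing at). The `tag` function below is an invented example, not the PEP's API:

```python
def tag(*parts):
    """Toy tag function: literal segments are strings, interpolations
    arrive as zero-argument callables the tag may evaluate (or not)."""
    out = []
    for part in parts:
        if callable(part):      # an interpolation: evaluate it now
            out.append(str(part()))
        else:                   # a literal string segment
            out.append(part)
    return "".join(out)

name = "World"
# Roughly what mytag"hello {name}" would desugar to, minus the
# annotation-scope subtlety described above:
result = tag("hello ", lambda: name)
print(result)  # -> hello World
```

The tag is free to store the lambdas and call them later, which is what makes deferred (and repeated) evaluation possible.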
I hope they eventually grant this power to regular functions, because otherwise I know folks will end up writing myfunc"{arg1},{arg2}" just to get that feature.
def greet(*args: Decoded | Interpolation) -> str:
    result = []
    for arg in args:
        match arg:
            case Decoded() as decoded:
                result.append(decoded)
            case Interpolation() as interpolation:
                value = interpolation.getvalue()
                result.append(value.upper())
    return f"{''.join(result)}!"
Isn't that just this?

def greet(*args):
    def convert(arg):
        return arg.getvalue().upper() if hasattr(arg, 'getvalue') else arg
    return ''.join(convert(arg) for arg in args) + '!'
1. https://protostrings.readthedocs.io/en/latest/index.xhtml
iirc I wanted to encode state in a recursive-descent parser without additional complexity to the parser.
Similar in purpose, not design: protostrings provides lazy and context-sensitive strings from the bottom up, rather than this proposal's top-down template style, which I feel addresses most of the concerns here.
- tags get a strings array that's referentially stable across invocations. This can function as a cache key to cache string-parsing work.
- tags can return any kind of value, not just a string. Often you need to give structured data to another cooperating API.
Deferred evaluation of expressions is very cool, and would be really useful for reactive use-cases, assuming they can be evaluated multiple times.
Another kitchen sink to add to Python's world-class kitchen sink collection.
greet("hello {world}")
which walks the call stack to find the variables and uses them as locals / arguments? If so: why not just do that? I would expect the performance to be kinda terrible compared to a language intrinsic, but this hardly seems like a thing worth truly optimizing. And if it's too costly at runtime, someone can implement a parse-time AST rewriter, like many Python tools already do. Heck, that example's `assert` is probably using pytest (because everyone uses pytest), and pytest does exactly this already; it isn't a language feature, it's just a normal library using normal Python features.
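For what it's worth, the call-stack-walking version is only a few lines. This is a sketch of the idea under the commenter's assumptions, using the stdlib `inspect` module; the `greet`/`demo` names are illustrative, not anyone's actual API:

```python
import inspect
import string

def greet(template):
    """Library-level version of the idea: walk one frame up the call
    stack and format the template against the caller's variables."""
    caller = inspect.currentframe().f_back
    # Locals shadow globals, mirroring normal name resolution
    namespace = {**caller.f_globals, **caller.f_locals}
    return string.Formatter().vformat(template, (), namespace) + "!"

def demo():
    world = "Earth"
    # No language support needed -- just frame inspection at call time
    return greet("hello {world}")

print(demo())  # -> hello Earth!
```

As the comment notes, this is slow (frame inspection on every call) and opaque to static tooling, which is part of why an intrinsic is attractive; but it demonstrates the feature is expressible in today's Python.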
From the article: """
~CURL[curl https://catfact.ninja/fact]
|> Req.request!()
This is actual code; you can run this. It will convert the curl command into a Req request and you will get a response back. This is really great, because we have been able to increase the expressiveness of the language.
"""
The concern isn't having features that will make it easier to write DSLs, my problem is that people will misuse it in regular Python projects.
I know that one of the authors is Guido, but I'm not buying the motivation. Jinja2 and Django templates are pretty much just using Python; it's not really much of an issue, and I don't believe that business logic should exist in your templates anyway. As for the SQL argument, it will still be possible for people to mess it up, even with tag strings, unless you completely remove all legacy code. The issue here isn't that the existing facilities aren't good enough; it's that many developers aren't aware of the concepts, like prepared statements. If developers aren't reading the docs to learn about prepared statements, why would they do so for some DSL developed using tag strings?
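To spell out the prepared-statements point with the stdlib `sqlite3` module (the table and data here are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.executemany("INSERT INTO users VALUES (?)", [("alice",), ("bob",)])

user_input = "alice' OR '1'='1"

# Unsafe: string interpolation lets the quote break out of the literal,
# turning the WHERE clause into a tautology that matches every row.
unsafe = f"SELECT * FROM users WHERE name = '{user_input}'"
print(len(conn.execute(unsafe).fetchall()))  # -> 2 (all rows)

# Safe: a parameterized query; the driver never treats input as SQL.
safe = "SELECT * FROM users WHERE name = ?"
print(len(conn.execute(safe, (user_input,)).fetchall()))  # -> 0
```

The facility has been there all along; the failure mode is developers reaching for interpolation instead, which is the commenter's point about awareness rather than missing features.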
Obviously Guido is a better developer than me any day, so I might be completely wrong, but this doesn't feel right. I've seen tools developed to avoid just doing the proper training, and the result is always worse.