[WIP] PEP 484: Describe a way of annotating decorated declarations #242
Conversation
One related issue:

pep-0484.txt (outdated):

    class DatabaseSession: ...

    @contextmanager  # type: Callable[[str], ContextManager[DatabaseSession]]
Just my 5 cents: what about cast(Callable[[str], ContextManager[DatabaseSession]], 'session')? Should this also be allowed here? The point is to (maybe) provide some support for runtime introspection tools, etc.
Huh. I can imagine a new decorator that works like so:
@cast_decorator(Callable[[str], ContextManager[DatabaseSession]])
@contextmanager
def session(...): ...
That would have runtime inspectability. Is it clunkier? I haven't let it bounce around in my head enough to tell.
Is that the general idea you're getting at?
Is that the general idea you're getting at?
Yes. Maybe just cast could be tweaked to be acceptable here. For example:
def cast(*args):
    if len(args) == 2:
        return args[1]
    if len(args) == 1:
        return lambda x: x  # Maybe the type could be stored here on 'x'?
Then it could be used like this:
cast(List[int], []) # Returns []
@cast(Callable[[str], ContextManager[DatabaseSession]])
@contextmanager
def session(...): ...
But this is just an idea, maybe it is not very useful.
I don't like the overloading of cast -- it feels too magical. However, I might prefer using a new decorator instead of a type comment, primarily because Python 3.6 mostly doesn't require type comments any more and they feel like a legacy feature, and also because a decorator would allow splitting the annotation to multiple lines.
I wouldn't call this a cast but a declaration. A type checker should still check that the inferred signature is compatible with the declared one. This is unlike casts, which a type checker should blindly accept.
Here are ideas for the syntax:
@decorator_type(Callable[[int], str])
@mydecorator
def f(...): ...

@decorated_type(Callable[[int], str])
@mydecorator
def f(...): ...

@decorated_type([int], str)  # convenience 2-argument syntax that doesn't require Callable[...]
@mydecorator
def f(...): ...

@decorated_signature([int], str)
@mydecorator
def f(...): ...
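To illustrate the cast-vs-declaration distinction above, here is a minimal sketch; decorated_type is one of the names proposed here and is stubbed out as a runtime no-op, since no such decorator exists in typing:

from contextlib import contextmanager
from typing import Callable, ContextManager, Iterator, List, TypeVar, cast

T = TypeVar("T")

def decorated_type(typ: object) -> Callable[[T], T]:
    # Stand-in for the proposed decorator: a no-op at runtime. A checker
    # implementing the proposal would verify that the inferred type of the
    # decorated item is compatible with `typ`, unlike cast, which it accepts
    # blindly.
    def declare(obj: T) -> T:
        return obj
    return declare

class DatabaseSession: ...

# Declaration: the checker should verify that @contextmanager really turns
# `session` into a Callable[[str], ContextManager[DatabaseSession]].
@decorated_type(Callable[[str], ContextManager[DatabaseSession]])
@contextmanager
def session(url: str) -> Iterator[DatabaseSession]:
    yield DatabaseSession()

# Cast: accepted without any checking, even if the stated type is wrong.
wrong = cast(List[int], session)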
I wouldn't call this a cast but a declaration. A type checker should still check that the inferred signature is compatible with the declared one.
👍
Cool, yes, good distinction between cast and declaration.
Of those possibilities, I like the decorated_type wording best.
I definitely like the full Callable-included syntax, because
- it doesn't introduce any new concepts
- there's no requirement that the type of a decorated function be a function
- it applies equally well to decorated classes, if we need such a thing there.
From the static analysis perspective, both comment-based and decorator-based annotations are OK. But in the context of PEP 526 variable annotations, comment-based annotations feel a bit outdated. A Python 3.6 program would be free of comment-based annotations if we chose a decorator-based approach.
Word. Rewrote with a decorator-based approach, and liking the overall thing more now. See what y'all think?
pep-0484.txt (outdated):

    finally:
        s.close()

    If both the decorator and the function being decorated have type
@sixolet Could you clarify this paragraph a bit? It's not clear if "decorator" refers to the decorated declaration or the decorator function itself.
If "decorator" here is the decorated declaration (@contextmanager
), then its type isn't really related to the type of the function being decorated. In the example above there is no point in matching Callable[[str], Generator[DatabaseSession, None, None]
against Callable[[str], ContextManager[DatabaseSession]]
.
Alternatively, if "decorator" here means the decorator function itself (def contextmanager(...): ...
, then this paragraph is not related to the idea of adding type hints for decorated declarations.
So, my meaning was something like "if both the definition of the decorator and the definition of the function being decorated have type annotations..."
And this is relevant to the behavior in that it's the same difference as between an assignment of an Any-typed value to a variable with a declared type, and the assignment of a value with a known type to a variable with a declared type. Is there a clearer way to explain that?
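The analogy with assignments, roughly (the names here are only for illustration):

from typing import Any, List

dynamic: Any = ["a", "b"]   # value whose type is Any
known = ["a", "b"]          # value whose type is inferred as List[str]

xs: List[int] = dynamic     # accepted by a checker: Any is compatible with any declared type
ys: List[int] = known       # flagged by a checker: List[str] is not compatible with List[int]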
@sixolet I would be happy with just extending "decorator" to "the definition of the decorator" or "the decorator function".
I completely rewrote it to attempt to avoid the confusion. See if you're happy?
Tagging @JukkaL
pep-0484.txt (outdated):

    class DatabaseSession: ...

    @contextmanager  # type: Callable[[str], ContextManager[DatabaseSession]]
    def session(url: str) -> Generator[DatabaseSession, None, None]:
I would rather use Iterator[DatabaseSession] instead of Generator[DatabaseSession, None, None]. It is equivalent, but reads simpler.
pep-0484.txt (outdated):

    Decorators can modify the types of the functions or classes they
    decorate. Use the `decorated_type` decorator to declare the type of
    the resulting item after all other decorators have been applied.::
I think it should be ``applied::`` (without the dot).
pep-0484.txt (outdated):

    ----------

    Decorators can modify the types of the functions or classes they
    decorate. Use the `decorated_type` decorator to declare the type of
I think it is better to use double backquotes (``decorated_type``) here and in a few places below; this will make the reST style consistent with the rest of the PEP.
LGTM now apart from two typographical nits.
Didn't follow all the discussion but I like this, and it's easier to implement (no need to change typed_ast). Not sure if @JukkaL gave it his seal of approval yet?
Looks good to me.
Is this ready to be merged then?
Well, at this point it's a proposed addition to PEP 484. Can you hold off for now?
@gvanrossum yep, no rush; just couldn't tell if I should merge it or not since everyone seemed happy and I was going through my notifications on GitHub.
In https://mail.python.org/pipermail/python-dev/2017-May/148070.html I am arguing that I prefer this proposal over Jukka's alternative.
Hmm. Guido's linked mail reconvinced me; I was wavering towards Jukka's
solution. I think the main thing that reconvinced me was that my proposal
doesn't have any action-at-a-distance -- all the type info is right there.
Here are some further thoughts (from an offline discussion with @gvanrossum).

I'd prefer to have an implementation in mypy + ``mypy_extensions`` before merging this.

It's not entirely clear if this would help with annotating functions that use variadic type variables (these are a separate proposal that hasn't been accepted), and I'd prefer not to have to invent a separate mechanism for those. Guido suggested that we could reuse this syntax in order to allow using variadic type variables in the externally visible signature of a function only. The motivation is that it may be impractical to fully type check many or most function bodies that use variadic type variables. We could treat them as ``Any`` within function bodies. Example:

Ts = TypeVar('Ts', variadic=True)
RT = TypeVar('RT')

@decorated_type(Callable[[Arg(Callable[[int, Expand[Ts]], RT], 'fn'),
                          Arg(Tuple[Expand[Ts]], 'args')], RT])
def call_fn(fn: Callable[..., RT], args: Tuple[Any, ...]) -> RT:
    return fn(0, *args)

I was initially worried that [...]. Finally, the fancy ``Callable`` syntax is still only in ``mypy_extensions``.

All in all I'm ambivalent between the two proposals. I'm okay with going with Naomi's proposal -- but I'd like to add some wording about using the decorator on otherwise undecorated functions, and I'd like to go through at least one mypy release cycle with the feature in ``mypy_extensions``.
I like the idea of variadic types being only in the externally visible signature. I actually like the [...]. (Don't worry, I won't be offended if we sit on variadic types for another few months discussing them.)
I like this version of the PEP patch. We should not merge it though until we've implemented this (using mypy_extensions first) and we're happy with it in practice.
    ----------

    Decorators can modify the types of the functions or classes they
    decorate. Use the ``decorated_type`` decorator to declare the type of
Oh wait, we're going with declared_type instead, right?
Change to declared_type.
[Once more with feeling] Change to declared_type.
    Decorators can modify the types of the functions or classes they
    decorate. Use the ``decorated_type`` decorator to declare the type of
    the resulting item after all other decorators have been applied::
Might also tweak the words to indicate that there may not be any other decorators.
    multiple decorators, ``decorated_type`` must be topmost. The
    ``decorated_type`` decorator is invalid on a function declaration that
    is also decorated with ``overload``, but you can annotate the
    implementation of the overload series with ``decorated_type``.
Should we clarify (via example and/or specification language) that the type in @decorated_type may differ from the naturally inferred type, and that the inferred type must be assignable to the declared type? (Like in assignments or passing an argument to a function.) E.g. either one may have an Any where the other has something more specific.
I'm not sure if the semantics should exactly match those of assignment, though that would be my first approximation until we find a counterexample.
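A sketch of what that could look like with an unannotated decorator whose result is inferred as Any; decorated_type is again a stand-in for the proposed decorator (stubbed as a runtime no-op), not an existing API:

from typing import Callable, TypeVar

T = TypeVar("T")

def decorated_type(typ: object) -> Callable[[T], T]:
    # Runtime no-op stand-in for the proposed decorator.
    return lambda obj: obj

def untyped_wrapper(func):  # unannotated, so the decorated result is inferred as Any
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

# Inferred type of the decorated function: Any (from untyped_wrapper).
# Declared type: Callable[[int], str]. Under assignment-like semantics a
# checker accepts this, since Any is assignable to any declared type.
@decorated_type(Callable[[int], str])
@untyped_wrapper
def render(x: int) -> str:
    return str(x)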
    class DatabaseSession: ...

    @decorated_type(Callable[[str], ContextManager[DatabaseSession]])
    @contextmanager
May need a different example than contextmanager since (at least mypy) now gets the type right there through the use of plugins.
Is anything going on with this PR or should I just close it?
This is still a topic of interest. The plan is to prototype this in mypy first, then update the PEP. The timeline is unclear though. I'd like to keep this PR open unless it bothers you.
There's a pull request out in Jukka's queue in mypy, and it has been there for a couple of months, if I remember correctly.
@gvanrossum it can stay open; I just changed the title to add "[WIP]" so it's clear that it's blocked on something (that's why it's open) and can otherwise be ignored.
What's the status of this PR?
We need to discuss this on typing-sig. I'll send an email there.
Closing per typing-sig discussion. Nobody spoke up who thinks we should give this priority.
Hey! Thanks for your work on this! I have kind of an interesting case that's coming up because of mypy tripping on decorators. Manual annotations could help cases like this, but it might be a broader issue. This seems like the best place to comment on it though -- please redirect me if I should ask elsewhere.

I've built a stackable decorator (kinda like a monoid) for doing some algebraic-ish function composition with additional safety checks along the way. It's pretty neat, if I do say so myself. Here's a minimal repro (you can extrapolate how this can be super useful!). It's relatively complicated, so I've broken it down step by step afterwards.

from __future__ import annotations
from typing import Generic, Union, Callable, TypeVar

T = TypeVar("T")

class BuildCombinator(Generic[T]):
    def __init__(self, x: Callable[[T], int]) -> None:
        self.x = x

    @staticmethod
    def build(x: Callable[[T], int]) -> BuildCombinator[T]:
        return BuildCombinator[T](x)

    def __call__(
        self, func: Union[BaseCombinator[T], BuildCombinator[T]]
    ) -> Union[BaseCombinator[T], BuildCombinator[T]]:
        if isinstance(func, BaseCombinator):
            bc: BaseCombinator[T] = func
            return BaseCombinator[T](lambda t: bc.n(t) * self.x(t), bc.f)
        elif isinstance(func, BuildCombinator):
            blc: BuildCombinator[T] = func
            return BuildCombinator[T](lambda t: blc.x(t) * self.x(t))
        else:
            assert False

    def do_thing(self, obj: T) -> str:
        return str(self.x(obj))

class BaseCombinator(Generic[T]):
    def __init__(self, n: Callable[[T], int], f: Callable[[T, str], str]) -> None:
        self.n = n
        self.f = f

    @staticmethod
    def base(x: Callable[[T, str], str]) -> BaseCombinator[T]:
        return BaseCombinator[T](lambda t: 1, x)

    def do_thing(self, obj: T) -> str:
        return self.f(obj, str(self.n(obj)))

class A:
    @BuildCombinator.build
    def mul_3(self) -> int:
        return 3

    @BaseCombinator.base
    def one(self, s: str) -> str:
        return s

    @mul_3
    @BaseCombinator.base
    def three(self, s: str) -> str:
        return s

    @mul_3
    @mul_3
    @BaseCombinator.base
    def nine(self, s: str) -> str:
        return s

    @BuildCombinator.build
    def mul_5(self) -> int:
        return 5

    @mul_3
    @mul_5
    @BuildCombinator.build
    def mul_15(self) -> int:
        return 1

    @BuildCombinator.build
    def mul_7(self) -> int:
        return 7

    @mul_15
    @BaseCombinator.base
    def broken(self, s: str) -> str:
        return s

    @mul_5
    @mul_3
    @BaseCombinator.base
    def works(self, s: str) -> str:
        return s

if __name__ == "__main__":
    a = A()
    print(a.one.do_thing(a), 1)
    print(a.three.do_thing(a), 3)
    print(a.nine.do_thing(a), 9)
    print(a.broken.do_thing(a), 15)
    print(a.works.do_thing(a), 15)

This prints out what you'd expect. But mypy complains.
Both of the compositions should be yielding the exact same compositions, but the types don't propagate. Interestingly, when I use typing.get_type_hints on a.broken.do_thing and a.works.do_thing, I get [...].

That's kinda complex, so I can quickly walk through what the code is doing. Essentially I have a base case function template that looks like this:
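Presumably (judging from the repro above, where the original snippet did not survive) the template is a method of this shape:

# The "base case" template: a method taking a string and returning a string,
# i.e. the Callable[[T, str], str] that BaseCombinator.base expects.
def f(self, s: str) -> str:
    ...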
Anything that matches that template can be decorated like so:
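For example, as in the repro above:

# Decorating a template-shaped method produces a BaseCombinator[A].
@BaseCombinator.base
def one(self, s: str) -> str:
    return s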
build transforms x into a "layer" of computation that operates on "bases", e.g.:
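From the repro above, for instance:

# build wraps a Callable[[T], int] into a stackable "layer".
@BuildCombinator.build
def mul_3(self) -> int:
    return 3
# Applying mul_3 to a base multiplies that base's n(t) by 3.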
So far, so good. Code compiles and works and type checks, beautifully might I add. I'm a very happy user! But it gets a little bit hairy and I get some mysterious type error. I have a few different "base" combinators that have different semantics; e.g. you could imagine one that uses the build combinators to pass arguments, or one that uses them to log debugging info. The important thing is that these base cases are typed, and are aware of the type of the class A that they are inside of.
Now, the catch is that the BuildCombinator itself is allowed to combine on itself recursively. Once it's "in the monad" you can keep on stacking new effects. I've used multiplication because prime factoring is helpful for thinking about it, but the effects can be whatever.
This is where the issues come in. When you stack directly on a function from a base case, it's fine. E.g.:
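For instance, from the repro above:

# Stacking a build layer directly on a freshly decorated base type checks fine.
@mul_3
@BaseCombinator.base
def three(self, s: str) -> str:
    return s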
But when you try to stack on a build combinator that is itself the result of stacking (like mul_15 above), the types don't propagate:
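As in the repro above:

# mul_15 is itself the result of stacking (@mul_3 @mul_5 on a build), so its
# static type is the Union; stacking it on a base is where mypy complains.
@mul_15
@BaseCombinator.base
def broken(self, s: str) -> str:
    return s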
But the program still seems to be actually correct; it's just the type checker.
So it seems that some of this issue boils down to the fact that I'm returning a Union from my combinator rather than using a singledispatchmethod. That wasn't leaping out to me as the reason originally, but it makes sense. It's an odd edge that the combinator keeps the non-union type information around in one case and not the other. A fix for this conceptually would be to use singledispatchmethod, but that bears its own issues, as the typing of singledispatchmethod is pending a fix, and it's not clear how that information gets propagated from the base call for variadic return types.
As an interim solution, I've added a new method that's @build.stack with a non-union return; this clears my issue in the near term, but it still seems fixable longer term.
Sorry, your comments here will go unread. If you want to discuss an addition to the type system, please use the typing-sig mailing list or the python/typing tracker on GitHub.
We want to be able to declare the fully-decorated type of a decorated declaration:
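The example that presumably followed here matches the one in the patch, reassembled from the diff fragments quoted above; decorated_type is the proposed (not yet existing) decorator, and the body of session between the quoted fragments is an assumption:

from contextlib import contextmanager
from typing import Callable, ContextManager, Generator

class DatabaseSession: ...

@decorated_type(Callable[[str], ContextManager[DatabaseSession]])
@contextmanager
def session(url: str) -> Generator[DatabaseSession, None, None]:
    s = DatabaseSession()  # assumed; only the try/finally lines are quoted in the thread
    try:
        yield s
    finally:
        s.close()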
In relation to python/typing#412