What does the "yield" keyword do in Python?



15 Answers

Shortcut to grokking yield

When you see a function with yield statements, apply this easy trick to understand what will happen:

  1. Insert a line result = [] at the start of the function.
  2. Replace each yield expr with result.append(expr).
  3. Insert a line return result at the bottom of the function.
  4. Yay, no more yield statements! Read and figure out the code.
  5. Compare the function to the original definition.

This trick may give you an idea of the logic behind the function, but what actually happens with yield is significantly different from what happens in the list-based approach. In many cases the yield approach will be a lot more memory efficient, and faster too. In other cases this trick will get you stuck in an infinite loop, even though the original function works just fine. Read on to learn more...
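As an illustration of the trick, here is a small hypothetical generator (first_n_squares is a made-up name) next to its list-based rewrite:

```python
# A small generator function (hypothetical example).
def first_n_squares(n):
    for i in range(n):
        yield i * i

# The same function after applying the trick above:
def first_n_squares_list(n):
    result = []                      # step 1
    for i in range(n):
        result.append(i * i)         # step 2: yield expr -> result.append(expr)
    return result                    # step 3

# For finite cases the two produce the same values:
print(list(first_n_squares(4)))      # [0, 1, 4, 9]
print(first_n_squares_list(4))       # [0, 1, 4, 9]
```

For a finite generator like this one the two functions produce the same values; for an infinite generator, the return in step 3 would never be reached and the rewritten version would loop forever.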

Don't confuse your iterables, iterators, and generators

First, the iterator protocol - when you write

for x in mylist:
    ...loop body...

Python performs the following two steps:

  1. Gets an iterator for mylist:

    Call iter(mylist) -> this returns an object with a next() method (or __next__() in Python 3).

    [This is the step most people forget to tell you about]

  2. Uses the iterator to loop over items:

    Keep calling the next() method on the iterator returned from step 1. The return value of next() is assigned to x, and the loop body is executed. If an exception StopIteration is raised from within next(), it means there are no more values in the iterator and the loop is exited.

The truth is, Python performs the above two steps whenever it wants to loop over the contents of an object - so it could be a for loop, but it could also be code like otherlist.extend(mylist) (where otherlist is a Python list).
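The two steps can be spelled out by hand; this sketch mirrors what the for loop does under the hood:

```python
mylist = [10, 20, 30]

# Step 1: get an iterator for mylist.
it = iter(mylist)

# Step 2: keep calling next() until StopIteration is raised.
while True:
    try:
        x = next(it)
    except StopIteration:
        break      # no more values: the loop is over
    print(x)       # the "loop body"
```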

Here mylist is an iterable because it implements the iterator protocol. In a user-defined class, you can implement the __iter__() method to make instances of your class iterable. This method should return an iterator. An iterator is an object with a next() method. It is possible to implement both __iter__() and next() on the same class, and have __iter__() return self. This will work for simple cases, but not when you want two iterators looping over the same object at the same time.
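For instance, a minimal user-defined iterable (Python 3 method names; Countdown is a made-up class for illustration) that returns self from __iter__():

```python
class Countdown:
    """A minimal iterable/iterator combined in one class."""
    def __init__(self, start):
        self.current = start

    def __iter__(self):
        return self          # fine for simple cases only (see caveat above)

    def __next__(self):
        if self.current <= 0:
            raise StopIteration
        self.current -= 1
        return self.current + 1

print(list(Countdown(3)))    # [3, 2, 1]
```

Because __iter__() returns self, a second loop over the same Countdown instance finds it already exhausted - exactly the limitation described above.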

So that's the iterator protocol; many objects implement this protocol:

  1. Built-in lists, dictionaries, tuples, sets, and files.
  2. User-defined classes that implement __iter__().
  3. Generators.

Note that a for loop doesn't know what kind of object it's dealing with - it just follows the iterator protocol, and is happy to get item after item as it calls next(). Built-in lists return their items one by one, dictionaries return the keys one by one, files return the lines one by one, etc. And generators return... well, that's where yield comes in:

def f123():
    yield 1
    yield 2
    yield 3

for item in f123():
    print(item)

If, instead of yield statements, you had three return statements in f123(), only the first would get executed, and the function would exit. But f123() is no ordinary function. When f123() is called, it does not return any of the values in the yield statements! It returns a generator object. Also, the function does not really exit - it goes into a suspended state. When the for loop tries to loop over the generator object, the function resumes from its suspended state at the very next line after the yield it previously returned from, executes the next line of code - in this case, a yield statement - and returns that as the next item. This happens until the function exits, at which point the generator raises StopIteration and the loop exits.

So the generator object is sort of like an adapter: at one end it exhibits the iterator protocol, exposing __iter__() and next() methods to keep the for loop happy. At the other end, however, it runs the function just enough to get the next value out of it, and puts it back in suspended mode.
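You can watch this suspend-and-resume behaviour by driving a generator by hand with next() (a small sketch with print statements added to expose the control flow):

```python
def f123():
    print("before 1")
    yield 1
    print("before 2")
    yield 2

gen = f123()          # nothing printed yet: the body has not started running
print(next(gen))      # runs up to the first yield: prints "before 1", then 1
print(next(gen))      # resumes after the first yield: prints "before 2", then 2
# A third next(gen) would raise StopIteration.
```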

Why use generators?

Usually you can write code that doesn't use generators but implements the same logic. One option is to use the temporary list 'trick' mentioned before. That will not work in all cases, e.g. if you have infinite loops, or it may make inefficient use of memory when you have a really long list. The other approach is to implement a new iterable class SomethingIter that keeps state in instance members and performs the next logical step in its next() (or __next__() in Python 3) method. Depending on the logic, the code inside the next() method may end up looking very complex and prone to bugs. Here generators provide a clean and easy solution.
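As a rough sketch of that comparison (SomethingIter here is an illustrative name, not a standard class), the same logic written both ways:

```python
# Generator version: the logic reads like a plain loop.
def evens_up_to(n):
    for i in range(0, n, 2):
        yield i

# Class-based version: the same logic, with the loop state
# moved into instance members and __next__() (Python 3 naming).
class SomethingIter:
    def __init__(self, n):
        self.i = 0
        self.n = n

    def __iter__(self):
        return self

    def __next__(self):
        if self.i >= self.n:
            raise StopIteration
        value = self.i
        self.i += 2
        return value

print(list(evens_up_to(7)))       # [0, 2, 4, 6]
print(list(SomethingIter(7)))     # [0, 2, 4, 6]
```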

python iterator generator yield coroutine

What is the use of the yield keyword in Python? What does it do?

For example, I'm trying to understand this code1:

def _get_child_candidates(self, distance, min_dist, max_dist):
    if self._leftchild and distance - max_dist < self._median:
        yield self._leftchild
    if self._rightchild and distance + max_dist >= self._median:
        yield self._rightchild  

And this is the caller:

result, candidates = [], [self]
while candidates:
    node = candidates.pop()
    distance = node._get_dist(obj)
    if distance <= max_dist and distance >= min_dist:
        result.extend(node._values)
    candidates.extend(node._get_child_candidates(distance, min_dist, max_dist))
return result

What happens when the _get_child_candidates method is called? Is a list returned? A single element? Is it called again? When will subsequent calls stop?

1. The code comes from Jochen Schulz (jrschulz). This is the link to the complete source: Module mspace.
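To see the pattern from the question in isolation, here is a self-contained sketch with a hypothetical Node class; the real mspace code does distance checks, but the generator/extend mechanics are the same:

```python
class Node:
    """Hypothetical stand-in for the question's tree nodes."""
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

    def get_child_candidates(self):
        # Yields 0, 1 or 2 children; the caller receives a generator.
        if self.left:
            yield self.left
        if self.right:
            yield self.right

root = Node(1, Node(2), Node(3))

result, candidates = [], [root]
while candidates:
    node = candidates.pop()
    result.append(node.value)
    # extend() drains the generator one element at a time.
    candidates.extend(node.get_child_candidates())
print(result)   # [1, 3, 2]
```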




The yield keyword boils down to two simple facts:

  1. If the compiler detects the yield keyword anywhere inside a function, that function no longer returns via the return statement. Instead, it immediately returns a lazy 'pending list' object called a generator.
  2. A generator is iterable. What is an iterable? It's anything like a list, set, range, or dict-view, with a built-in protocol for visiting each element in a certain order.

In a nutshell: a generator is a lazy, incrementally-pending list, and yield statements allow you to use function notation to program the list values the generator should incrementally spit out.

generator = myYieldingFunction(...)
x = list(generator)

   generator
       v
[x[0], ..., ???]

         generator
             v
[x[0], x[1], ..., ???]

               generator
                   v
[x[0], x[1], x[2], ..., ???]

                       StopIteration exception
[x[0], x[1], x[2]]     done

list==[x[0], x[1], x[2]]

Let's define a function makeRange that's just like Python's range. Calling makeRange(n) RETURNS A GENERATOR:

def makeRange(n):
    # return 0,1,2,...,n-1
    i = 0
    while i < n:
        yield i
        i += 1

>>> makeRange(5)
<generator object makeRange at 0x19e4aa0>

To force the generator to immediately return its pending values, you can pass it into list() (just like you could with any iterable):

>>> list(makeRange(5))
[0, 1, 2, 3, 4]

Comparing the example to "just returning a list"

The example above can be thought of as merely creating a list which you append to and return:

# list-version                   #  # generator-version
def makeRange(n):                #  def makeRange(n):
    """return [0,1,2,...,n-1]""" #~     """return 0,1,2,...,n-1"""
    TO_RETURN = []               #>
    i = 0                        #      i = 0
    while i < n:                 #      while i < n:
        TO_RETURN += [i]         #~         yield i
        i += 1                   #          i += 1  ## indented
    return TO_RETURN             #>

>>> makeRange(5)
[0, 1, 2, 3, 4]

However, there's one major difference; see the last section.

How you might use generators

An iterable is the last part of a list comprehension, and all generators are iterable, so they're often used like so:

#                   _ITERABLE_
>>> [x+10 for x in makeRange(5)]
[10, 11, 12, 13, 14]

To get a better feel for generators, you can play around with the itertools module (be sure to use chain.from_iterable rather than chain when warranted). For example, you might even use generators to implement infinitely-long lazy lists like itertools.count(). You could implement your own def enumerate(iterable): zip(count(), iterable), or alternatively do so with the yield keyword in a while-loop.
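Both enumerate implementations mentioned above can be sketched like this:

```python
from itertools import count

# zip-based version, built on itertools.count():
def my_enumerate_zip(iterable):
    return zip(count(), iterable)

# yield-based version, using a plain loop:
def my_enumerate_yield(iterable):
    i = 0
    for item in iterable:
        yield (i, item)
        i += 1

print(list(my_enumerate_zip('abc')))    # [(0, 'a'), (1, 'b'), (2, 'c')]
print(list(my_enumerate_yield('abc')))  # [(0, 'a'), (1, 'b'), (2, 'c')]
```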

Please note: generators can actually be used for many more things, such as implementing coroutines, non-deterministic programming, and other elegant things. However, the "lazy lists" viewpoint I present here is the most common use you will find.

Behind the scenes

This is how the "Python iteration protocol" works, i.e. what is going on when you do list(makeRange(5)). This is what I described earlier as a "lazy, incremental list".

>>> x=iter(range(5))
>>> next(x)
0
>>> next(x)
1
>>> next(x)
2
>>> next(x)
3
>>> next(x)
4
>>> next(x)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
StopIteration

The built-in function next() calls the object's .next() method (.__next__() in Python 3), which is part of the "iteration protocol" and is found on all iterators. You can manually use the next() function (and other parts of the iteration protocol) to implement fancy things, usually at the expense of readability.
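For example, next() with its optional default argument gives a readable "first match or fallback" idiom:

```python
numbers = [3, 7, 8, 12]

# First even number, or None if there is none:
first_even = next((n for n in numbers if n % 2 == 0), None)
print(first_even)       # 8

# No match -> the default is returned instead of raising StopIteration:
first_negative = next((n for n in numbers if n < 0), None)
print(first_negative)   # None
```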

Minutiae

Normally, most people wouldn't care about the following distinctions and probably want to stop reading here.

In Python-speak, an iterable is any object that "understands the concept of a for-loop", like a list [1,2,3], and an iterator is a specific instance of the requested for-loop, like [1,2,3].__iter__(). A generator is exactly the same as any iterator, except for the way it was written (with function syntax).

When you request an iterator from a list, it creates a new iterator. However, when you request an iterator from an iterator (which you would rarely do), it just gives you a copy of itself.

Thus, in the unlikely event that you find yourself failing to do something like this...

> x = myRange(5)
> list(x)
[0, 1, 2, 3, 4]
> list(x)
[]

...then remember that a generator is an iterator; that is, it is one-time-use. If you want to reuse it, you should call myRange(...) again. If you need to use the result twice, convert it to a list and store it in a variable: x = list(myRange(5)). Those who absolutely need to clone a generator (for example, people doing terrifyingly hackish metaprogramming) can use itertools.tee if absolutely necessary, since the copyable-iterator Python PEP standards proposal has been deferred.
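A minimal itertools.tee sketch (after tee-ing, you should not touch the original iterator any more):

```python
from itertools import tee

def myRange(n):
    i = 0
    while i < n:
        yield i
        i += 1

a, b = tee(myRange(3))   # two independent iterators over one generator
print(list(a))           # [0, 1, 2]
print(list(b))           # [0, 1, 2] -- b was not exhausted by consuming a
```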




yield is just like return - it returns whatever you tell it to (as a generator). The difference is that the next time you call the generator, execution starts from the last call to the yield statement. Unlike return, the stack frame is not cleaned up when a yield occurs; however, control is transferred back to the caller, so its state will be resumed the next time the function is called.

In the case of your code, the function get_child_candidates is acting like an iterator so that when you extend your list, it adds one element at a time to the new list.

list.extend calls an iterator until it's exhausted. In the case of the code sample you posted, it would be much clearer to just return a tuple and append that to the list.
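From the caller's side both shapes behave the same; a tiny sketch:

```python
def children_generator():
    yield 'left'
    yield 'right'

def children_tuple():
    return ('left', 'right')

candidates = []
candidates.extend(children_generator())  # extend drains the iterator...
candidates.extend(children_tuple())      # ...or just walks the tuple
print(candidates)   # ['left', 'right', 'left', 'right']
```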




For those who prefer a minimal working example, meditate on this interactive Python session:

>>> def f():
...   yield 1
...   yield 2
...   yield 3
... 
>>> g = f()
>>> for i in g:
...   print(i)
... 
1
2
3
>>> for i in g:
...   print(i)
... 
>>> # Note that this time nothing was printed



TL;DR

Instead of this:

def squares_list(n):
    the_list = []                         # Replace
    for x in range(n):
        y = x * x
        the_list.append(y)                # these
    return the_list                       # lines

Do this:

def squares_the_yield_way(n):
    for x in range(n):
        y = x * x
        yield y                           # with this one.

Whenever you find yourself building a list from scratch, yield each piece instead.

This was my first "aha" moment with yield.

yield is a sugary way to say

build a series of stuff

Same behavior:

>>> for square in squares_list(4):
...     print(square)
...
0
1
4
9
>>> for square in squares_the_yield_way(4):
...     print(square)
...
0
1
4
9

Different behavior:

Yield is single-pass: you can only iterate through once. When a function has a yield in it we call it a generator function. And an iterator is what it returns. That's revealing. We lose the convenience of a container, but gain the power of an arbitrarily long series.

Yield is lazy; it puts off computation. A function with a yield in it doesn't actually execute at all when you call it. The iterator object it returns uses magic to maintain the function's internal context. Each time you call next() on the iterator (this happens in a for-loop), execution inches forward to the next yield. (return raises StopIteration and ends the series.)

Yield is versatile. It can do infinite loops:

>>> def squares_all_of_them():
...     x = 0
...     while True:
...         yield x * x
...         x += 1
...
>>> squares = squares_all_of_them()
>>> for _ in range(4):
...     print(next(squares))
...
0
1
4
9

If you need multiple passes and the series isn't too long, just call list() on it:

>>> list(squares_the_yield_way(4))
[0, 1, 4, 9]

Brilliant choice of the word yield because both meanings apply:

yield — produce or provide (as in agriculture)

...provide the next data in the series.

yield — give way or relinquish (as in political power)

...relinquish CPU execution until the iterator advances.




There is one type of answer that I don't feel has been given yet, among the many great answers that describe how to use generators. Here is the programming language theory answer:

The yield statement in Python returns a generator. A generator in Python is a function that returns continuations (and specifically a type of coroutine, but continuations represent the more general mechanism to understand what is going on).

Continuations in programming languages theory are a much more fundamental kind of computation, but they are not often used, because they are extremely hard to reason about and also very difficult to implement. But the idea of what a continuation is, is straightforward: it is the state of a computation that has not yet finished. In this state, the current values of variables, the operations that have yet to be performed, and so on, are saved. Then at some point later in the program the continuation can be invoked, such that the program's variables are reset to that state and the operations that were saved are carried out.

Continuations, in this more general form, can be implemented in two ways. In the call/cc way, the program's stack is literally saved and then when the continuation is invoked, the stack is restored.

In continuation passing style (CPS), continuations are just normal functions (only in languages where functions are first class) which the programmer explicitly manages and passes around to subroutines. In this style, program state is represented by closures (and the variables that happen to be encoded in them) rather than variables that reside somewhere on the stack. Functions that manage control flow accept continuation as arguments (in some variations of CPS, functions may accept multiple continuations) and manipulate control flow by invoking them by simply calling them and returning afterwards. A very simple example of continuation passing style is as follows:

def save_file(filename):
  def write_file_continuation():
    write_stuff_to_file(filename)

  check_if_file_exists_and_user_wants_to_overwrite(write_file_continuation)

In this (very simplistic) example, the programmer saves the operation of actually writing the file into a continuation (which can potentially be a very complex operation with many details to write out), and then passes that continuation (ie, as a first-class closure) to another operator which does some more processing, and then calls it if necessary. (I use this design pattern a lot in actual GUI programming, either because it saves me lines of code or, more importantly, to manage control flow after GUI events trigger.)

The rest of this post will, without loss of generality, conceptualize continuations as CPS, because it is a hell of a lot easier to understand and read.


Now let's talk about generators in Python. Generators are a specific subtype of continuation. Whereas continuations are able in general to save the state of a computation (ie, the program's call stack), generators are only able to save the state of iteration over an iterator. This definition is slightly misleading for certain use cases of generators, though. For example:

def f():
  while True:
    yield 4

This is clearly a reasonable iterable whose behavior is well defined -- each time the generator iterates over it, it returns 4 (and does so forever). But it isn't probably the prototypical type of iterable that comes to mind when thinking of iterators (ie, for x in collection: do_something(x) ). This example illustrates the power of generators: if anything is an iterator, a generator can save the state of its iteration.

To reiterate: Continuations can save the state of a program's stack, and generators can save the state of iteration. This means that continuations are a lot more powerful than generators, but also that generators are a lot, lot easier. They are easier for the language designer to implement, and they are easier for the programmer to use (if you have some time to burn, try to read and understand this page about continuations and call/cc).

But you could easily implement (and conceptualize) generators as a simple, specific case of continuation passing style:

Whenever yield is called, it tells the function to return a continuation. When the function is called again, it starts from wherever it left off. So, in pseudo-pseudocode (ie, not pseudocode, but not code) the generator's next method is basically as follows:

class Generator():
  def __init__(self,iterable,generatorfun):
    self.next_continuation = lambda:generatorfun(iterable)

  def next(self):
    value, next_continuation = self.next_continuation()
    self.next_continuation = next_continuation
    return value

where the yield keyword is actually syntactic sugar for the real generator function, basically something like:

def generatorfun(iterable):
  if len(iterable) == 0:
    raise StopIteration
  else:
    return (iterable[0], lambda:generatorfun(iterable[1:]))

Remember that this is just pseudocode and the actual implementation of generators in Python is more complex. But as an exercise to understand what is going on, try to use continuation passing style to implement generator objects without use of the yield keyword.
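Taking up that exercise, here is one runnable (if simplistic) version of the pseudocode above, with no yield anywhere. The names follow the pseudocode; this is an illustration of the idea, not how CPython actually implements generators:

```python
class Generator:
    """CPS-style generator: each step returns (value, next_continuation)."""
    def __init__(self, iterable, generatorfun):
        self.next_continuation = lambda: generatorfun(iterable)

    def __iter__(self):
        return self

    def __next__(self):
        value, next_continuation = self.next_continuation()
        self.next_continuation = next_continuation
        return value

def generatorfun(iterable):
    if len(iterable) == 0:
        raise StopIteration   # fine here: this is a plain function, not a generator
    return (iterable[0], lambda: generatorfun(iterable[1:]))

g = Generator([1, 2, 3], generatorfun)
print(list(g))   # [1, 2, 3]
```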




While a lot of answers show why you'd use a yield to create a generator, there are more uses for yield . It's quite easy to make a coroutine, which enables the passing of information between two blocks of code. I won't repeat any of the fine examples that have already been given about using yield to create a generator.

To help understand what a yield does in the following code, you can use your finger to trace the cycle through any code that has a yield . Every time your finger hits the yield , you have to wait for a next or a send to be entered. When a next is called, you trace through the code until you hit the yield … the code on the right of the yield is evaluated and returned to the caller… then you wait. When next is called again, you perform another loop through the code. However, you'll note that in a coroutine, yield can also be used with a send … which will send a value from the caller into the yielding function. If a send is given, then yield receives the value sent, and spits it out the left hand side… then the trace through the code progresses until you hit the yield again (returning the value at the end, as if next was called).

For example:

>>> def coroutine():
...     i = -1
...     while True:
...         i += 1
...         val = (yield i)
...         print("Received %s" % val)
...
>>> sequence = coroutine()
>>> next(sequence)
0
>>> next(sequence)
Received None
1
>>> sequence.send('hello')
Received hello
2
>>> sequence.close()



I was going to post "read page 19 of Beazley's 'Python: Essential Reference' for a quick description of generators", but so many others have posted good descriptions already.

Also, note that yield can be used in coroutines as the dual of their use in generator functions. Although it isn't the same use as your code snippet, (yield) can be used as an expression in a function. When a caller sends a value to the method using the send() method, then the coroutine will execute until the next (yield) statement is encountered.

Generators and coroutines are a cool way to set up data-flow type applications. I thought it would be worthwhile knowing about the other use of the yield statement in functions.
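A small Python 3 sketch of that (yield) expression pattern: a running-total coroutine (running_total is a made-up example):

```python
def running_total():
    total = 0
    while True:
        value = (yield total)   # pause; 'total' goes out, sent 'value' comes in
        total += value

acc = running_total()
print(next(acc))        # prime the coroutine: runs to the first yield -> 0
print(acc.send(10))     # 10
print(acc.send(5))      # 15
```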




From a programming viewpoint, the iterators are implemented as thunks .

To implement iterators, generators, and thread pools for concurrent execution, etc. as thunks (also called anonymous functions), one uses messages sent to a closure object, which has a dispatcher, and the dispatcher answers to "messages".

http://en.wikipedia.org/wiki/Message_passing

" next " is a message sent to a closure, created by the " iter " call.

There are lots of ways to implement this computation. I used mutation, but it is easy to do it without mutation, by returning the current value and the next yielder.

Here is a demonstration which uses the structure of R6RS, but the semantics is absolutely identical to Python's. It's the same model of computation, and only a change in syntax is required to rewrite it in Python.

Welcome to Racket v6.5.0.3.

-> (define gen
     (lambda (l)
       (define yield
         (lambda ()
           (if (null? l)
               'END
               (let ((v (car l)))
                 (set! l (cdr l))
                 v))))
       (lambda(m)
         (case m
           ('yield (yield))
           ('init  (lambda (data)
                     (set! l data)
                     'OK))))))
-> (define stream (gen '(1 2 3)))
-> (stream 'yield)
1
-> (stream 'yield)
2
-> (stream 'yield)
3
-> (stream 'yield)
'END
-> ((stream 'init) '(a b))
'OK
-> (stream 'yield)
'a
-> (stream 'yield)
'b
-> (stream 'yield)
'END
-> (stream 'yield)
'END
->



Here is a simple example:

def isPrimeNumber(n):
    print("isPrimeNumber({}) call".format(n))
    if n == 1:
        return False
    for x in range(2, n):
        if n % x == 0:
            return False
    return True

def primes(n=1):
    while True:
        print("loop step ---------------- {}".format(n))
        if isPrimeNumber(n):
            yield n
        n += 1

for n in primes():
    if n > 10:
        break
    print("writing result {}".format(n))

Output:

loop step ---------------- 1
isPrimeNumber(1) call
loop step ---------------- 2
isPrimeNumber(2) call
loop step ---------------- 3
isPrimeNumber(3) call
writing result 3
loop step ---------------- 4
isPrimeNumber(4) call
loop step ---------------- 5
isPrimeNumber(5) call
writing result 5
loop step ---------------- 6
isPrimeNumber(6) call
loop step ---------------- 7
isPrimeNumber(7) call
writing result 7
loop step ---------------- 8
isPrimeNumber(8) call
loop step ---------------- 9
isPrimeNumber(9) call
loop step ---------------- 10
isPrimeNumber(10) call
loop step ---------------- 11
isPrimeNumber(11) call

I am not a Python developer, but it seems to me that yield holds the position of program flow, and the next loop starts from the "yield" position. It seems like it is waiting at that position, and just before that, returning a value outside, and the next time it continues to work.

It seems to be an interesting and nice ability :D




Like every answer suggests, yield is used for creating a sequence generator. It's used for generating a sequence dynamically. For example, while reading a file line by line over a network, you can use yield as follows:

def getNextLines():
   while con.isOpen():
       yield con.read()

You can use it in your code as follows:

for line in getNextLines():
    doSomeThing(line)

Execution Control Transfer gotcha

The execution control will be transferred from getNextLines() to the for loop when yield is executed. Thus, every time getNextLines() is invoked, execution begins from the point where it was paused last time.

Thus in short, a function with the following code

def simpleYield():
    yield "first time"
    yield "second time"
    yield "third time"
    yield "Now some useful value {}".format(12)

for i in simpleYield():
    print(i)

will print

first time
second time
third time
Now some useful value 12



In summary, the yield statement transforms your function into a factory that produces a special object called a generator which wraps around the body of your original function. When the generator is iterated, it executes your function until it reaches the next yield, then suspends execution and evaluates to the value passed to yield. It repeats this process on each iteration until the path of execution exits the function. For example:

def simple_generator():
    yield 'one'
    yield 'two'
    yield 'three'

for i in simple_generator():
    print(i)

simply outputs

one
two
three

The power comes from using the generator with a loop that calculates a sequence: the generator executes the loop, stopping each time to 'yield' the next result of the calculation. In this way it calculates a list on the fly, the benefit being the memory saved for especially large calculations.

Say you wanted to create your own range function that produces an iterable range of numbers. You could do it like so:

def myRangeNaive(i):
    n = 0
    range = []
    while n < i:
        range.append(n)
        n = n + 1
    return range

and use it like this:

for i in myRangeNaive(10):
    print(i)

But this is inefficient because

  • You create an array that you only use once (this wastes memory)
  • This code actually loops over that array twice! :(

Luckily Guido and his team were generous enough to develop generators so we could just do this;

def myRangeSmart(i):
    n = 0
    while n < i:
       yield n
       n = n + 1
    return

for i in myRangeSmart(10):
    print(i)

Now upon each iteration, next() executes the generator's function body until it either reaches a 'yield' statement, at which point it stops and 'yields' the value, or reaches the end of the function. In this case, on the first call, next() executes up to the yield statement and yields 'n'. On the next call it will execute the increment statement, jump back to the 'while', evaluate it, and if true, it will stop and yield 'n' again. It will continue that way until the while condition returns false and the generator jumps to the end of the function.




(My below answer only speaks from the perspective of using Python generator, not the underlying implementation of generator mechanism , which involves some tricks of stack and heap manipulation.)

When yield is used instead of a return in a Python function, that function is turned into something special called a generator function. That function will return an object of generator type. The yield keyword is a flag to notify the Python compiler to treat such a function specially. Normal functions will terminate once some value is returned from them. But with the help of the compiler, the generator function can be thought of as resumable. That is, the execution context will be restored and the execution will continue from the last run, until you explicitly call return, which will raise a StopIteration exception (which is also part of the iterator protocol), or reach the end of the function. I found a lot of references about generators, but this one from the functional programming perspective is the most digestible.

(Now I want to talk about the rationale behind generator , and the iterator based on my own understanding. I hope this can help you grasp the essential motivation of iterator and generator. Such concept shows up in other languages as well such as C#.)

As I understand it, when we want to process a bunch of data, we usually first store the data somewhere and then process it one by one. But this intuitive approach is problematic. If the data volume is huge, it's expensive to store it as a whole beforehand. So instead of storing the data itself directly, why not store some kind of metadata indirectly, i.e. the logic for how the data is computed.

There are 2 approaches to wrap such metadata.

  1. The OO approach, we wrap the metadata as a class . This is the so-called iterator who implements the iterator protocol (ie the __next__() , and __iter__() methods). This is also the commonly seen iterator design pattern .
  2. The functional approach, we wrap the metadata as a function . This is the so-called generator function . But under the hood, the returned generator object still IS-A iterator because it also implements the iterator protocol.

Either way, an iterator is created, ie some object that can give you the data you want. The OO approach may be a bit complex. Anyway, which one to use is up to you.
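Both approaches side by side for the same "metadata" (the logic for computing Fibonacci numbers), as a sketch:

```python
from itertools import islice

# 1. The OO approach: an iterator class implementing the iterator protocol.
class Fib:
    def __init__(self):
        self.a, self.b = 0, 1

    def __iter__(self):
        return self

    def __next__(self):
        value = self.a
        self.a, self.b = self.b, self.a + self.b
        return value

# 2. The functional approach: a generator function wrapping the same logic.
def fib():
    a, b = 0, 1
    while True:
        yield a
        a, b = b, a + b

print(list(islice(Fib(), 6)))   # [0, 1, 1, 2, 3, 5]
print(list(islice(fib(), 6)))   # [0, 1, 1, 2, 3, 5]
```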




The yield keyword simply collects returning results. Think of yield like return +=




Yet another TL;DR

Iterator on a list: next() returns the next element of the list.

Iterator generator: next() will compute the next element on the fly (execute code).

You can see yield/generators as a way to manually run the control flow from outside (like continuing a loop one step at a time), by calling next, however complex the flow.

Note: a generator is not a normal function; it remembers its previous state, like local variables (its stack). See the other answers or articles for detailed explanations. A generator can only be iterated over once. You could do without yield, but it would not be as nice, so it can be considered 'very nice' language sugar.



