Is enumerate lazy in Python?

I'd like to know what happens when I pass the result of a generator function to Python's enumerate():

def veryBigHello():
    i = 0
    while i < 10000000:
        i += 1
        yield "hello"


numbered = enumerate(veryBigHello())
for i, word in numbered:
    print i, word

Does enumerate iterate lazily, or does it slurp everything into the &lt;enumerate object&gt; first? I'm 99.999% sure it's lazy, so can I treat it exactly like a generator function, or is there anything I need to watch out for?


It's lazy. It's fairly easy to prove that's the case:

>>> def abc():
...     letters = ['a','b','c']
...     for letter in letters:
...         print letter
...         yield letter
...
>>> numbered = enumerate(abc())
>>> for i, word in numbered:
...     print i, word
...
a
0 a
b
1 b
c
2 c
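The interleaved output above can also be observed one step at a time. A minimal sketch in Python 3 syntax (where print is a function): pulling a single pair from the enumerate object drives the underlying generator exactly one step, printing exactly one letter.

```python
def abc():
    # A generator that announces each item as it is produced.
    for letter in ['a', 'b', 'c']:
        print(letter)
        yield letter

numbered = enumerate(abc())

# One next() call produces one letter -- nothing is pre-computed.
first = next(numbered)
print(first)  # (0, 'a')
```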

Since you can call this function without getting out-of-memory exceptions, it definitely is lazy:

def veryBigHello():
    i = 0
    while i < 1000000000000000000000000000:
        i += 1
        yield "hello"


numbered = enumerate(veryBigHello())
for i, word in numbered:
    print i, word
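To make the same point without a print loop that runs for an astronomically long time, one can slice a few pairs off the front with itertools.islice, which is also lazy. A sketch in Python 3 syntax, using an explicitly unbounded generator:

```python
from itertools import islice

def endless_hello():
    # Never terminates on its own; safe only because consumers stop early.
    while True:
        yield "hello"

numbered = enumerate(endless_hello())

# islice pulls exactly three pairs and then stops; if enumerate
# materialized its input first, this line would never return.
first_three = list(islice(numbered, 3))
print(first_three)  # [(0, 'hello'), (1, 'hello'), (2, 'hello')]
```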

It's even easier to tell than either of the previous answers suggests:

$ python
Python 2.5.5 (r255:77872, Mar 15 2010, 00:43:13)
[GCC 4.3.4 20090804 (release) 1] on cygwin
Type "help", "copyright", "credits" or "license" for more information.
>>> abc = (letter for letter in 'abc')
>>> abc
<generator object at 0x7ff29d8c>
>>> numbered = enumerate(abc)
>>> numbered
<enumerate object at 0x7ff29e2c>

If enumerate didn't perform lazy evaluation it would return [(0,'a'), (1,'b'), (2,'c')] or some (nearly) equivalent.
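The repr check can be pushed one step further: an enumerate object is its own iterator, and each next() call pulls exactly one item from the source. A sketch in Python 3 syntax:

```python
abc = (letter for letter in 'abc')
numbered = enumerate(abc)

# An iterator returns itself from iter(), unlike a materialized list.
assert iter(numbered) is numbered

print(next(numbered))  # (0, 'a')

# The source generator has only advanced one step; pulling from it
# directly yields 'b', bypassing enumerate's counter entirely.
print(next(abc))       # 'b'
```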

Of course, enumerate is really just a fancy generator:

def myenumerate(iterable):
    count = 0
    for item in iterable:
        yield (count, item)
        count += 1


for i, val in myenumerate((letter for letter in 'abc')):
    print i, val
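The real built-in also accepts an optional start argument (since Python 2.6), which the hand-rolled version above could mimic by initializing count = start. The offset changes only the counter; the iteration stays lazy. Python 3 syntax:

```python
# enumerate(iterable, start) offsets the counter without
# materializing anything.
pairs = list(enumerate('abc', 1))
print(pairs)  # [(1, 'a'), (2, 'b'), (3, 'c')]
```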

An old-school alternative, which I needed because I was using a generator written by someone else (sklearn) that didn't work with the approaches here:

i = -1
for x in some_generator:
    i += 1
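The same manual counting can be expressed lazily with itertools.count and zip, since zip never consumes more than it is asked for. A sketch in Python 3 syntax; some_generator here is a hypothetical stand-in for the external generator:

```python
from itertools import count

def some_generator():
    # Hypothetical stand-in for the external (sklearn) generator.
    yield 'x'
    yield 'y'
    yield 'z'

# zip is lazy, so pairing with an infinite count() is safe:
# it stops as soon as some_generator() is exhausted.
pairs = list(zip(count(), some_generator()))
print(pairs)  # [(0, 'x'), (1, 'y'), (2, 'z')]
```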