Combine awaitables like Promise.all

In asynchronous JavaScript, it is easy to run tasks in parallel with Promise.all and wait for all of them to complete:

function delay(ms) {
  // helper assumed here: resolves after ms milliseconds
  return new Promise(resolve => setTimeout(resolve, ms));
}


async function bar(i) {
  console.log('started', i);
  await delay(1000);
  console.log('finished', i);
}


async function foo() {
  await Promise.all([bar(1), bar(2)]);
}


// This works too:
async function my_all(promises) {
  for (let p of promises) await p;
}


async function foo() {
  await my_all([bar(1), bar(2), bar(3)]);
}

I tried to rewrite the latter in Python:

import asyncio


async def bar(i):
    print('started', i)
    await asyncio.sleep(1)
    print('finished', i)


async def aio_all(seq):
    for f in seq:
        await f


async def main():
    await aio_all([bar(i) for i in range(10)])


loop = asyncio.get_event_loop()
loop.run_until_complete(main())
loop.close()

But it executes my tasks sequentially.

What is the simplest way to await multiple awaitables? Why doesn't my approach work?


The equivalent would be using asyncio.gather:

import asyncio


async def bar(i):
    print('started', i)
    await asyncio.sleep(1)
    print('finished', i)


async def main():
    await asyncio.gather(*[bar(i) for i in range(10)])


loop = asyncio.get_event_loop()
loop.run_until_complete(main())
loop.close()

Why doesn't my approach work?

Because when you await each item in seq one at a time, each coroutine only starts running once you reach it, so the previous one has to finish first. In essence, you have synchronous code masquerading as async. If you really wanted to, you could implement your own version of asyncio.gather using loop.create_task or asyncio.ensure_future.
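For example, here is a minimal sketch of that idea (not the actual implementation of gather): every awaitable is wrapped in a task up front with asyncio.ensure_future so the event loop starts all of them, and only then are they awaited one by one to collect the results in order.

import asyncio


async def bar(i):
    print('started', i)
    await asyncio.sleep(1)
    print('finished', i)
    return i


async def aio_all(seq):
    # Schedule every awaitable as a Task first, so the event loop starts
    # all of them concurrently; then await them in order for the results.
    tasks = [asyncio.ensure_future(f) for f in seq]
    return [await t for t in tasks]


async def main():
    print(await aio_all([bar(i) for i in range(10)]))


asyncio.run(main())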

EDIT

The original answer used the lower-level asyncio.wait.

I noticed that asyncio.gather() may be a better choice than asyncio.wait() if we want ordered results.

As the docs indicate, the order of result values from asyncio.gather() corresponds to the order of awaitables in aws. asyncio.wait(), by contrast, returns sets of done and pending tasks, so that ordering is not preserved. You can test it.

https://docs.python.org/3/library/asyncio-task.html#asyncio.gather
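A small sketch of the difference, with made-up staggered sleep times so the completion order differs from the submission order:

import asyncio


async def bar(i):
    await asyncio.sleep(1 - i / 10)  # later items finish sooner
    return i


async def main():
    # gather: results come back in the order the awaitables were passed
    print(await asyncio.gather(*[bar(i) for i in range(5)]))  # [0, 1, 2, 3, 4]

    # wait: returns (done, pending) sets, so the input ordering is lost
    done, _ = await asyncio.wait([asyncio.ensure_future(bar(i)) for i in range(5)])
    print([t.result() for t in done])  # some arbitrary order


asyncio.run(main())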

asyncio.gather() returns a list of the outputs of each async function call.

import asyncio


async def bar(i):
    print('started', i)
    await asyncio.sleep(1)
    print('finished', i)
    return i


async def main():
    values = await asyncio.gather(*[bar(i) for i in range(10)])
    print(values)


asyncio.run(main())

gather takes an arbitrary number of positional arguments for the concurrent jobs rather than a list, so we unpack the list with *.

It's very common to need these return values (values in my example) instead of designing your function/method to rely on side effects.

As of Python 3.11:

A more modern way to create and run tasks concurrently and wait for their completion is asyncio.TaskGroup.

... although I couldn't find any reason why TaskGroup should be preferred over gather
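For reference, a sketch of the same example using asyncio.TaskGroup (Python 3.11+); results are read from the tasks after the async with block, which exits only once every task in the group has completed:

import asyncio


async def bar(i):
    print('started', i)
    await asyncio.sleep(1)
    print('finished', i)
    return i


async def main():
    # The async with block exits only after every task in the group is done.
    async with asyncio.TaskGroup() as tg:
        tasks = [tg.create_task(bar(i)) for i in range(10)]
    print([t.result() for t in tasks])


asyncio.run(main())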