Asynchronous function calls in PHP

I am working on a PHP web application, and I need to perform some network operations in the request handler, such as fetching data from a remote server based on the user's request.

Is it possible to simulate asynchronous behavior in PHP, given that I have to pass some data to the function and also need the output from it?

My code looks like this:

<?php

$data1 = processGETandPOST();
$data2 = processGETandPOST();
$data3 = processGETandPOST();

$response1 = makeNetworkCall($data1);
$response2 = makeNetworkCall($data2);
$response3 = makeNetworkCall($data3);

processNetworkResponse($response1);
processNetworkResponse($response2);
processNetworkResponse($response3);

/* HTML and other UI stuff here */

exit;
?>

Each network operation takes about 5 seconds to complete, so if I make 3 requests, my application's response time increases by 15 seconds in total.

The makeNetworkCall() function just performs an HTTP POST request.

The remote server is a third-party API, so I have no control over it.

PS: Please do not suggest AJAX or anything along those lines. I am looking for a way to do this in PHP itself, possibly with a C++ extension or something similar.


cURL is going to be your only real choice here (either that, or using non-blocking sockets and some custom logic).

The curl_multi family of functions should send you in the right direction. There is no asynchronous processing in PHP itself, but if you're trying to make multiple simultaneous web requests, cURL multi will take care of that for you.
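As a minimal sketch of what the curl_multi approach looks like (the URLs are placeholders and error handling is omitted):

```php
<?php
// Hypothetical endpoints standing in for the slow network calls.
$urls = ['https://api.example.com/a', 'https://api.example.com/b'];

$mh = curl_multi_init();
$handles = [];
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;
}

// Drive all transfers concurrently; total time is roughly that of the
// slowest request, not the sum of all of them.
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh); // block until there is activity on some handle
} while ($running > 0);

$responses = [];
foreach ($handles as $url => $ch) {
    $responses[$url] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```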

I think if the HTML and other UI stuff needs the returned data, then there is not going to be a way to make it asynchronous.

I believe the only way to do this in PHP would be to log a request in a database and have a cron job check every minute, use something like Gearman for queue processing, or exec() a command-line process.

In the meantime, your PHP page would have to generate some HTML or JS that reloads every few seconds to check on progress, which is not ideal.

To sidestep the issue: how many different requests are you expecting? Could you download them all automatically every hour or so and save them to a database?
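For the exec() route mentioned above, a minimal fire-and-forget sketch (worker.php is a hypothetical script containing the slow network call):

```php
<?php
// Redirecting output and appending & makes exec() return immediately
// instead of waiting for the worker to finish.
$payload = escapeshellarg(json_encode(['user_id' => 42])); // illustrative job data
exec("php worker.php {$payload} > /dev/null 2>&1 &");
// The page can now render while worker.php runs independently.
```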

I don't have a direct answer, but you might want to look into the following:

There is also pecl_http v2, which is a wrapper around cURL. It can be installed via PECL.

http://devel-m6w6.rhcloud.com/mdref/http/

Nowadays, it's better to use queues than threads (for those who don't use Laravel, there are tons of other implementations out there).

The basic idea is that your original PHP script puts tasks or jobs into a queue. Then you have queue workers running elsewhere, taking jobs off the queue and processing them independently of the original PHP script.

The advantages are:

  1. Scalability - you can just add worker nodes to keep up with demand. In this way, tasks are run in parallel.
  2. Reliability - modern queue managers such as RabbitMQ, ZeroMQ, Redis, etc, are made to be extremely reliable.
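As a rough sketch of the queue idea, assuming the phpredis extension and a local Redis server (the queue name and job shape are illustrative):

```php
<?php
// Producer (runs inside the web request): push a job and return immediately.
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);
$redis->lPush('jobs', json_encode(['url' => 'https://api.example.com/a']));

// Worker (a separate, long-running PHP process):
// while (true) {
//     [, $raw] = $redis->brPop(['jobs'], 0); // blocks until a job arrives
//     $job = json_decode($raw, true);
//     makeNetworkCall($job['url']);          // the slow part, off the request path
// }
```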

I think some code for the cURL solution is needed here, so I will share mine (it was written by mixing several sources, such as the PHP manual and its comments).

It performs some parallel HTTP requests (the domains in $aURLs) and prints each response as soon as it is completed (storing them in $done for other possible uses).

The code is longer than necessary because of the real-time printing part and the excess of comments, but feel free to edit the answer to improve it:

<?php
/* Strategies to avoid output buffering; skip this block if you don't need to
   print the responses before every cURL transfer is completed */
ini_set('output_buffering', 'off');        // Turn off output buffering
ini_set('zlib.output_compression', false); // Turn off PHP output compression
// Flush (send) the output buffer and turn off output buffering
while (@ob_end_flush());
if (function_exists('apache_setenv')) {
    apache_setenv('no-gzip', true);        // Prevent Apache from buffering it for deflate/gzip
}
header("Content-type: text/plain");        // Remove to use HTML
ini_set('implicit_flush', true);           // Implicitly flush the buffer(s)
ob_implicit_flush(true);
header('Cache-Control: no-cache');         // Recommended to prevent caching of event data
$string = str_repeat(' ', 1000);
output($string); // Safari and Internet Explorer have an internal 1K buffer
// Here starts the program output

function output($string) {
    ob_start();
    echo $string;
    if (ob_get_level() > 0) ob_flush();
    ob_end_clean(); // Clears the buffer and closes buffering
    flush();
}

function multiprint($aCurlHandles, $print = true) {
    global $done;
    // Iterate through the handles and get your content
    foreach ($aCurlHandles as $url => $ch) {
        if (!isset($done[$url])) { // Only check unready responses
            $html = curl_multi_getcontent($ch); // Get the content
            if ($html) {
                $done[$url] = $html;
                if ($print) output($html . PHP_EOL);
            }
        }
    }
}

function full_curl_multi_exec($mh, &$still_running) {
    do {
        $rv = curl_multi_exec($mh, $still_running); // Execute the handles
    } while ($rv == CURLM_CALL_MULTI_PERFORM); // CURLM_CALL_MULTI_PERFORM means you should call curl_multi_exec() again because there is still data available for processing
    return $rv;
}

set_time_limit(60); // Max execution time: 1 minute

$aURLs = array("http://domain/script1.php", "http://domain/script2.php"); // Array of URLs

$done = array(); // Responses of each URL

// Initialization
$aCurlHandles = array(); // Create an array for the individual curl handles
$mh = curl_multi_init(); // Init the cURL multi and return a new cURL multi handle
foreach ($aURLs as $id => $url) { // Add a handle for each URL
    $ch = curl_init(); // Init curl, and then set up your options
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // Return the result - very important
    curl_setopt($ch, CURLOPT_HEADER, 0); // No headers in the output
    $aCurlHandles[$url] = $ch;
    curl_multi_add_handle($mh, $ch);
}

// Process
$active = null; // The number of individual handles currently being worked on
$mrc = full_curl_multi_exec($mh, $active);
// As long as there are active connections and everything looks OK…
while ($active && $mrc == CURLM_OK) { // CURLM_OK means there is more data available, but it hasn't arrived yet
    // Wait for activity on any curl connection (1 second timeout)
    if (curl_multi_select($mh, 1) != -1) {
        usleep(500); // Adjust this wait to your needs
        // Process the data for as long as the system tells us to keep getting it
        $mrc = full_curl_multi_exec($mh, $active);
        //output("Still active processes: $active".PHP_EOL);
        // Print each response as soon as it is ready
        multiprint($aCurlHandles);
    }
}

// Print all the responses at the end instead
//multiprint($aCurlHandles, false);

// Finalize
foreach ($aCurlHandles as $url => $ch) {
    curl_multi_remove_handle($mh, $ch); // Remove the handle (assuming you are done with it)
}
curl_multi_close($mh); // Close the cURL multi handle
?>

One way is to use pcntl_fork() in a recursive function.

function networkCall() {
    $data = processGETandPOST();
    $response = makeNetworkCall($data);
    processNetworkResponse($response);
    return true;
}

function runAsync($times) {
    $pid = pcntl_fork();
    if ($pid == -1) {
        die('could not fork');
    } else if ($pid) {
        // we are the parent
        $times -= 1;
        if ($times > 0)
            runAsync($times);
        pcntl_wait($status); // Protect against zombie children
    } else {
        // we are the child
        networkCall();
        posix_kill(getmypid(), SIGKILL);
    }
}

runAsync(3);

One thing about pcntl_fork() is that it doesn't work when the script runs under Apache (the pcntl extension is not supported in the Apache SAPI). One way around that is to run the script using the PHP CLI, e.g. exec('php fork.php', $output); from another file. To do this you'll have two files: one that's loaded by Apache and one that's run with exec() from inside the file loaded by Apache, like this:

apacheLoadedFile.php

exec('php fork.php',$output);

fork.php

function networkCall() {
    $data = processGETandPOST();
    $response = makeNetworkCall($data);
    processNetworkResponse($response);
    return true;
}

function runAsync($times) {
    $pid = pcntl_fork();
    if ($pid == -1) {
        die('could not fork');
    } else if ($pid) {
        // we are the parent
        $times -= 1;
        if ($times > 0)
            runAsync($times);
        pcntl_wait($status); // Protect against zombie children
    } else {
        // we are the child
        networkCall();
        posix_kill(getmypid(), SIGKILL);
    }
}

runAsync(3);

This old question has a new answer. There are now a few "async" solutions for PHP these days (equivalent to Python's multiprocessing in the sense that they spawn new independent PHP processes rather than managing it at the framework level).

The two solutions I have seen are

Give them a try!

You cannot make a process asynchronous directly through PHP, but you can do the processing indirectly through the system's scheduling mechanism: crontab. If you hand a task off to be picked up by a cron job, it gets scheduled for you.
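A minimal spool-directory sketch of this idea (paths and job shape are illustrative): the web request writes a job file and returns, and a cron-driven worker processes pending files.

```php
<?php
// Web request side: drop a job file into a spool directory and return at once.
file_put_contents(
    '/var/spool/myapp/' . uniqid('job_', true) . '.json',
    json_encode(['url' => 'https://api.example.com/a'])
);

// Worker side, run from cron (e.g. "* * * * * php /path/to/worker.php"):
foreach (glob('/var/spool/myapp/*.json') as $file) {
    $job = json_decode(file_get_contents($file), true);
    makeNetworkCall($job['url']); // the slow call, now off the request path
    unlink($file);
}
```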

This is quite an old post, but to help people with Magento 1, Magento 2, and plain PHP, here are a few answers.

If you are using Magento 1, then running code asynchronously can easily be done by setting up a cron job, or by managing your code via queues (such as message queues), which in turn requires a cron setup.

If you are using Magento 2, then it depends on which version of Magento 2 you are using:

  1. For versions older than 2.3.3 you can refer https://devdocs.magento.com/guides/v2.4/extension-dev-guide/message-queues/async-message-queue-config-files.html
  2. For versions after 2.3.3 you can refer https://devdocs.magento.com/guides/v2.4/rest/asynchronous-web-endpoints.html and https://devdocs.magento.com/guides/v2.4/extension-dev-guide/async-operations.html

If you want to go with plain PHP, or none of the above references work for you, or you feel you need an easier way out, then you can refer to https://www.php.net/manual/en/function.shell-exec.php#118495

Example:

<?php

namespace Vendor\Module\Controller\ControllerNameFolder;

class YourCustomAsyncAction extends \Magento\Framework\App\Action\Action
{
    /**
     * resultJsonFactory
     *
     * @var \Magento\Framework\Controller\Result\JsonFactory
     */
    protected $resultJsonFactory;

    /**
     * _urlInterface
     *
     * @var \Magento\Framework\UrlInterface
     */
    protected $_urlInterface;

    /**
     * __construct
     *
     * @param \Magento\Framework\App\Action\Context $context
     * @param \Magento\Framework\Controller\Result\JsonFactory $resultJsonFactory
     * @param \Magento\Framework\UrlInterface $urlInterface
     * @return void
     */
    public function __construct(
        \Magento\Framework\App\Action\Context $context,
        \Magento\Framework\Controller\Result\JsonFactory $resultJsonFactory,
        \Magento\Framework\UrlInterface $urlInterface
    ) {
        parent::__construct($context);
        $this->resultJsonFactory = $resultJsonFactory;
        $this->_urlInterface = $urlInterface;
    }

    public function execute()
    {
        $result = $this->resultJsonFactory->create();
        $params = $this->getRequest()->getParams();

        /* prepare your URL */
        $url = $this->_urlInterface->getUrl('module_route/controllername/actionname', $params);

        /* async code: launch wget in the background and return immediately */
        shell_exec("wget $url > /dev/null 2>&1 &");

        return $result->setData(['success' => true]);
    }
}

If you don't want it to be a background process, the simplest solution is proc_open(). It executes terminal commands without waiting for them to finish. You can use it to call your script; something like this (this example is for Windows, where start launches the command in a new window):

proc_open('start cmd.exe php path-to-script.php', [], $pipes);

You can also have it run inline PHP code using the -r flag. Read the help of the php command to see all available flags.


Another way is to send an HTTP request to yourself, ignoring the timeout warning. You can implement the idea using file_get_contents():

$stream_context = stream_context_create(['http' => ['timeout' => 1]]);

// Replace "http://localhost/script-to-run-async.php" with
// the script that should be run asynchronously.
file_get_contents('http://localhost/script-to-run-async.php', false, $stream_context);

After all, you can suppress the warning produced by the timeout, e.g. with the @ error-control operator or a custom error handler.

There's a way to call a PHP function asynchronously:

  1. Serialize the class, method and params to a file:

         $callback = [
             'class'  => $class,  // this must be a string class name
             'method' => $method, // this must be a string method name
             'params' => $params,
         ];
         file_put_contents($file, addslashes(serialize($callback)) . "\n");

         // call the async.php file like:
         shell_exec('php async.php >> /tmp/log/async.std &');

    
  2. Make the serialized-data caller; let's say its name is async.php:

         $fileContent = file_get_contents($file);
         $callback = unserialize(stripslashes(trim($fileContent)));

    
  3. Make the class-method caller:

         $callback_class  = $callback['class'];
         $callback_method = $callback['method'];
         $callback_params = (array)$callback['params'];

         $reflection = new \ReflectionClass($callback_class);
         $is_static = (new \ReflectionMethod($callback_class, $callback_method))->isStatic();
         if ($is_static) {
             call_user_func_array([$callback_class, $callback_method], $callback_params);
         } elseif ($callback_method == '__construct') {
             $reflection->newInstanceArgs($callback_params);
         } else {
             $reflectionMethod = new \ReflectionMethod($callback_class, $callback_method);
             $reflectionMethod->invokeArgs(new $callback_class(), $callback_params);
         }