How can I run a PHP script in the background after a form is submitted?

Question
I have a form that, when submitted, runs basic code to process the submitted information and insert it into a database for display on a notification website. In addition, I have a list of people who have signed up to receive these notifications via email and SMS. This list is trivial at the moment (only pushing about 150), but it's enough that it takes upwards of a minute to cycle through the entire table of subscribers and send out 150+ emails. (The emails are sent individually, as requested by the system administrators of our email server, because of mass-email policies.)

During this time, the individual who posted the alert sits on the last page of the form for almost a minute without any positive reinforcement that their notification is being posted. This leads to other potential problems, all of which have possible solutions that I feel are less than ideal.

  1. First, the poster might think the server is lagging and click the "Submit" button again, causing the script to restart or run twice. I could solve this by using JavaScript to disable the button and replace the text with something like "Processing...", but that is less than ideal because the user would still be stuck on the page for the length of the script's execution. (Also, if JavaScript is disabled, this problem persists.)

  2. Second, the poster might close the tab or the browser prematurely after submitting the form. The script keeps running on the server until it tries to write back to the browser, but if the user then browses to any page in our domain (while the script is still running), the browser hangs loading that page until the script has ended. (This only happens when a tab or window is closed, not the entire browser application.) Still, this is less than ideal.

(Possible) Solution
I have decided I want to break the "email" part of the script out into a separate file that I can call after the notification has been posted. I originally thought of placing this on the confirmation page shown after the notification has been successfully posted. However, the user would not know this script is running, and any anomalies would not be apparent to them; this script cannot fail.

But what if I could run this script as a background process? So, my question is this: how can I execute a PHP script as a background process, triggered on form submission, that runs completely independently of the user's actions at the form level?

EDIT: This cannot be cron'ed. It must run the instant the form is submitted. These are high-priority notifications. In addition, the system administrators running our servers disallow crons from running more frequently than every 5 minutes.

165757 views

Background cron job sounds like a good idea for this.

You'll need ssh access to the machine to run the script as a cron.

Use $ php scriptname.php to run it.

Assuming you are running on a *nix platform, use cron and the php executable.
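For instance, a minimal crontab entry could look like the following (the php binary location, script path, and log path are all illustrative, not from the original answer):

```shell
# run the notification mailer every five minutes; adjust paths for your server
*/5 * * * * /usr/bin/php /path/to/send_notifications.php >> /var/log/notify.log 2>&1
```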

EDIT:

There are quite a number of questions asking for "running php without cron" on SO already. Here's one:

Schedule scripts without using CRON

That said, the exec() answer above sounds very promising :)

How about this?

  1. Your PHP script that holds the form saves a flag or some value into a database or file.
  2. A second PHP script polls for this value periodically and if it's been set, it triggers the Email script in a synchronous manner.

This second PHP script should be set to run as a cron.
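A rough sketch of that second script's logic, in shell for brevity (the flag-file path and the mailer script name are assumptions, not from the original answer):

```shell
#!/bin/sh
# poll for the flag the form handler leaves behind, and fire the mailer once
FLAG=/tmp/alert.flag

touch "$FLAG"       # for demonstration only: the form handler would create this

if [ -f "$FLAG" ]; then
    rm -f "$FLAG"   # consume the flag so the mailer runs at most once per alert
    echo "flag seen: would run 'php send_notifications.php' here"
fi
```

A real version would drop the `touch`, let the form handler create the flag, and actually invoke the PHP mailer in place of the `echo`.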

PHP exec("php script.php") can do it.

From the Manual:

If a program is started with this function, in order for it to continue running in the background, the output of the program must be redirected to a file or another output stream. Failing to do so will cause PHP to hang until the execution of the program ends.

So if you redirect the output to a log file (which is a good idea anyway), your calling script will not hang and your email script will run in the background.
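The effect is easy to demonstrate with a plain shell command standing in for the email script: once output is redirected and the command is backgrounded with &, the caller gets control back immediately.

```shell
# a slow "script" runs in the background; its output goes to a file, not the caller
( sleep 1; echo "email batch done" ) > /tmp/notify_demo.log 2>&1 &

echo "caller returned immediately"   # prints right away, before the job finishes

wait                       # for this demo only: let the background job complete
cat /tmp/notify_demo.log   # → email batch done
```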

And why not make an HTTP request to the script and ignore the response?

http://php.net/manual/en/function.httprequest-send.php

If you make a request to the script you need to call, your web server will run it in the background, and you can (in your main script) show a message telling the user that the script is running.

On Linux/Unix servers, you can execute a job in the background by using proc_open:

$descriptorspec = array(
    array('pipe', 'r'),               // stdin
    array('file', 'myfile.txt', 'a'), // stdout
    array('pipe', 'w'),               // stderr
);

$proc = proc_open('php email_script.php &', $descriptorspec, $pipes);

The & is the important bit here. The script will continue even after the original script has ended.

As far as I know you cannot do this in an easy way (fork, exec, etc. don't work under Windows); maybe you can reverse the approach: have the browser post the form in the background via Ajax, so if the post is still working the user has no wait time.
This can help even if you have to do some long processing.

About sending mail, it's always suggested to use a spooler: maybe a local, quick SMTP server that accepts your requests and then spools them to the real MTA, or put everything in a DB and use a cron that spools the queue.
The cron may be on another machine calling the spooler as an external URL:

* * * * * wget -O /dev/null http://www.example.com/spooler.php

If you can access the server over SSH and can run your own scripts, you can make a simple fifo server using PHP (although you will have to compile PHP with pcntl and POSIX support for forking).

The server can be written in anything really, you probably can easily do it in python.

Or the simplest solution would be sending an HTTP request and not reading the response, but the server might kill the script before it finishes processing.

Example server :

<?php
define('FIFO_PATH', '/home/user/input.queue');
define('FORK_COUNT', 10);

if (file_exists(FIFO_PATH)) {
    die(FIFO_PATH . ' exists, please delete it and try again.' . "\n");
}

if (!file_exists(FIFO_PATH) && !posix_mkfifo(FIFO_PATH, 0666)) {
    die('Couldn\'t create the listening fifo.' . "\n");
}

$pids = array();
$fp = fopen(FIFO_PATH, 'r+');
for ($i = 0; $i < FORK_COUNT; ++$i) {
    $pids[$i] = pcntl_fork();
    if (!$pids[$i]) {
        echo "process(" . posix_getpid() . ", id=$i)\n";
        while (true) {
            $line = chop(fgets($fp));
            if ($line == 'quit' || $line === false) break;
            echo "processing (" . posix_getpid() . ", id=$i) :: $line\n";
            // $data = json_decode($line);
            // processData($data);
        }
        exit();
    }
}
fclose($fp);
foreach ($pids as $pid) {
    pcntl_waitpid($pid, $status);
}
unlink(FIFO_PATH);
?>

Example client :

<?php
define('FIFO_PATH', '/home/user/input.queue');
if (!file_exists(FIFO_PATH)) {
    die(FIFO_PATH . ' doesn\'t exist, please make sure the fifo server is running.' . "\n");
}

function postToQueue($data) {
    $fp = fopen(FIFO_PATH, 'w+');
    stream_set_blocking($fp, false); // don't block
    $data = json_encode($data) . "\n";
    if (fwrite($fp, $data) != strlen($data)) {
        echo "Couldn't write; the server might be dead or there's a bug somewhere\n";
    }
    fclose($fp);
}

$i = 1000;
while (--$i) {
    postToQueue(array('xx' => 21, 'yy' => array(1, 2, 3)));
}
?>

Doing some experimentation with exec and shell_exec, I have uncovered a solution that worked perfectly! I chose to use shell_exec so I can log every notification process that happens (or doesn't). (shell_exec returns a string, and this was easier than using exec, assigning the output to a variable and then opening a file to write to.)

I'm using the following line to invoke the email script:

shell_exec("/path/to/php /path/to/send_notifications.php '".$post_id."' 'alert' >> /path/to/alert_log/paging.log &");

It is important to notice the & at the end of the command (as pointed out by @netcoder). This UNIX command runs a process in the background.

The extra variables surrounded in single quotes after the path to the script are set as $_SERVER['argv'] variables that I can call within my script.

The email script then outputs to my log file using the >> and will output something like this:

[2011-01-07 11:01:26] Alert Notifications Sent for http://alerts.illinoisstate.edu/2049 (SCRIPT: 38.71 seconds)
[2011-01-07 11:01:34] CRITICAL ERROR: Alert Notifications NOT sent for http://alerts.illinoisstate.edu/2049 (SCRIPT: 23.12 seconds)

The simplest way to run a PHP script in the background is:

php script.php >/dev/null &

The script will run in the background and the user will also reach the form's action page faster.

If you're on Windows, research proc_open or popen...

But if you're on a Linux server running cPanel, then this is the right approach:

#!/usr/bin/php
<?php
$pid = shell_exec("nohup nice php -f 'path/to/your/script.php' > /dev/null 2>&1 & echo $!");
while (exec("ps -p $pid -o pid=")) { // prints nothing once the process exits
    // you can also have a streamer here like fprintf,
    // or fgets
}
?>

Don't use fork() or curl if you doubt you can handle them, it's just like abusing your server

Lastly, in the script.php file which is called above, make sure you write:

<?php
ignore_user_abort(TRUE);
set_time_limit(0);
ob_start();
// <-- really optional but this is pure php

// Code to be run in the background

ob_flush(); flush();
// these two do the output process if you need some output,
// then to make all the logic possible

str_repeat(" ", 1500);
// for progress bars or loading images

sleep(2); // standard limit
?>

In my case I have 3 params, one of them a string (mensaje):

exec("C:\wamp\bin\php\php5.5.12\php.exe C:/test/N/trunk/api/v1/Process.php $idTest2 $idTest3 \"$mensaje\" >> c:/log.log &");

In my Process.php I have this code:

if (!isset($argv[1]) || !isset($argv[2]) || !isset($argv[3])) {
    die("Error.");
}

$idCurso   = $argv[1];
$idDestino = $argv[2];
$mensaje   = $argv[3];

For a background worker, I think you should try this technique. It lets you call as many pages as you like: all pages will run at once, independently, without waiting for each page's response (asynchronously).

form_action_page.php

<?php
post_async("http://localhost/projectname/testpage.php", "Keywordname=testValue");
// post_async("http://localhost/projectname/testpage.php", "Keywordname=testValue2");
// post_async("http://localhost/projectname/otherpage.php", "Keywordname=anyValue");
// call as many pages as you like; all pages will run at once,
// independently, without waiting for each page's response (asynchronous)

// your form db insertion or other code goes here; the calls above work as
// background jobs, so this line is reached before their responses come back
?>
<?php
/*
 * Executes a PHP page asynchronously so the current page does not have to wait for it to finish running.
 */
function post_async($url, $params)
{
    $post_string = $params;

    $parts = parse_url($url);

    $fp = fsockopen($parts['host'],
        isset($parts['port']) ? $parts['port'] : 80,
        $errno, $errstr, 30);

    $out  = "GET " . $parts['path'] . "?$post_string" . " HTTP/1.1\r\n"; // you can use POST instead of GET if you like
    $out .= "Host: " . $parts['host'] . "\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: " . strlen($post_string) . "\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    fclose($fp);
}
?>

testpage.php

<?php
echo $_REQUEST["Keywordname"]; // case1 Output > testValue
// here do your background operations; they will not halt the main page
?>

PS: if you want to send URL parameters in a loop, then follow this answer: https://stackoverflow.com/a/41225209/6295712

Of all the answers, none considered the ridiculously easy fastcgi_finish_request function, which, when called, flushes all remaining output to the browser and closes the FastCGI session and the HTTP connection, while letting the script keep running in the background.

Example:

<?php
header('Content-Type: application/json');
echo json_encode(['ok' => true]);
fastcgi_finish_request(); // The user is now disconnected from the script


// Do stuff with received data

Note: due to a wontfix quirk, calling flush() after fastcgi_finish_request() will cause it to exit without warning/error.

You may wish to call ignore_user_abort(true) beforehand to suppress this behavior, or simply avoid calling flush() after you've intentionally closed the connection:

$connected = true;

// Stuff...

fastcgi_finish_request();
$connected = false;

// ...
if ($connected) {
    flush();
}

Or

ignore_user_abort(true);
fastcgi_finish_request();
// Accidental flush()es won't do harm (even if they're still technically a bug)
flush();

This works for me. Try this:

exec("php asyn.php" . " > /dev/null 2>/dev/null &");

Use Amphp to execute jobs in parallel & asynchronously.

Install the library

composer require amphp/parallel-functions

Code sample

<?php

require "vendor/autoload.php";

use Amp\Promise;
use Amp\ParallelFunctions;

echo 'started<br>';

$promises[1] = ParallelFunctions\parallel(function () {
    // Send Email
})();

$promises[2] = ParallelFunctions\parallel(function () {
    // Send SMS
})();

Promise\wait(Promise\all($promises));

echo 'finished';

For your use case, you can do something like below:

<?php

use function Amp\ParallelFunctions\parallelMap;
use function Amp\Promise\wait;

$responses = wait(parallelMap([
    'a@example.com',
    'b@example.com',
    'c@example.com',
], function ($to) {
    return send_mail($to);
}));