Submit jobs to run in the background using Perl



I have a Perl script that submits a Hadoop command to run in the background, inside a while loop, once for each record in an input file. The scenario: I have to fire the Hadoop commands from Unix and make them run in the background, not wait for the background process to finish, then go to the next record in the input file and fire the next one. The code below submits the Hadoop command in the background but waits for it to complete, which I don't want.

open(my $data, '<', $file) or die "could not open '$file': $!\n";
while (my $line = <$data>) {
    chomp $line;
    my @fields = split ",", $line;
    next unless $fields[7] eq 'y';   # string comparison needs eq, not ==
    # echo must be part of the same shell command, inside the backticks;
    # \$! is escaped so Perl does not interpolate its own $! variable
    `nohup sqoop export --connect "jdbc:sqlserver://sqlserver:1433;database=$fields[0];user=sa;password=pwd" --table $fields[2] --export-dir $src_dir --input-fields-terminated-by '$fields[3]' --input-lines-terminated-by '$fields[4]' --m $fields[5] --staging-table $fields[6] --clear-staging-table --batch > $tgt_dir/$fields[2].out & echo \$! > $pid_file`;
}

Please let me know how I can do this.

Also, the Hadoop commands can run for more than a few minutes. I then have to make the script wait for the background commands that were fired, using their pids, and give a report based on the stdout files of the Hadoop commands.

I need help on how to accomplish this.

I think the problem is that nohup [command] & inside Perl backticks isn't working as intended; it is something to do with stdout, and the echo should be inside the backticks.

For example:

perl -E 'say `nohup sleep 30 & echo \$! > foo.pid &`'

does not run in the background for me.

But:

perl -E 'say `nohup sleep 30 > /dev/null & echo \$! > foo.pid`'

will work. Having said that, the comments saying you should google more about background processes in Perl are warranted. I suggest using fork() (or a CPAN module that wraps it) and managing the child processes there.
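As a minimal sketch of the fork() approach: the parent forks one child per command, each child exec()s its command with stdout redirected to a file, and the parent later waits on the collected pids. The command list here is a stand-in for the sqoop invocations built from the input file, and the output file names are illustrative.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Stand-ins for the real sqoop commands built from each input record.
my @commands = ('sleep 1', 'sleep 1');

my @pids;
for my $i (0 .. $#commands) {
    my $pid = fork();
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        # Child: replace this process with the command, stdout to a log file.
        exec "$commands[$i] > job$i.out 2>&1" or die "exec failed: $!";
    }
    # Parent: remember the child's pid and immediately continue the loop.
    push @pids, $pid;
}

# Later: wait for every background job and report its exit status.
for my $pid (@pids) {
    waitpid($pid, 0);
    print "pid $pid exited with status ", $? >> 8, "\n";
}
```

Because the parent only forks and records pids, the loop never blocks on a running job; all the waiting happens afterwards in one place, where each job's exit status and stdout file can feed the report.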

