Long load times on website when running R script
I'm attempting to query a MySQL database from a webpage. Within the R script, I have 4 different "query" functions along with multiple calculations to display statistical graphs on the webpage, all dependent on an "n" variable. I'm using PHP (via shell_exec) to call R and pass in "n". I'm using the RMySQL and ggplot2 libraries in R.
Running the R script with just 1 basic query function (which includes dbConnect(), dbGetQuery(), and on.exit(dbDisconnect()), plus png(), plot(), and dev.off()) takes ~15 seconds to display the graph on the website.
With 2 functions and 2 plots, I haven't had the patience to wait it out and see if it works, since the load time is so long. The queries are rather lengthy (they could probably be made simpler through looping), but I've tested that they work directly in MySQL, and I'm not sure how to avoid loop errors in SQL.
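Before restructuring anything, it may help to measure where the 15 seconds actually go. A minimal sketch of timing each stage separately (the database name, credentials, table, and columns below are placeholders, not from the original post):

```r
# Sketch: time the query and the plotting separately to find the bottleneck.
# "mydb", "stats", and the column names are hypothetical placeholders.
library(RMySQL)

n <- as.numeric(commandArgs(trailingOnly = TRUE)[1])

con <- dbConnect(MySQL(), dbname = "mydb", user = "user", password = "pass")
on.exit(dbDisconnect(con))

t_query <- system.time({
  res <- dbGetQuery(con, sprintf("SELECT x, y FROM stats WHERE n = %d", n))
})

t_plot <- system.time({
  png("graph.png")
  plot(res$x, res$y)
  dev.off()
})

cat("query:", t_query["elapsed"], "s; plot:", t_plot["elapsed"], "s\n")
```

If the query stage dominates, the fix belongs in SQL (indexes, simpler queries); if not, the R side is the place to look.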
Could the long loading time be due to having dbConnect()/dbDisconnect() in each individual function? Should I connect only once in the script (i.e. create a new "connect" function and call the other functions from there)?
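Repeated connect/disconnect cycles do add overhead, though usually far less than 15 seconds. A minimal sketch of the single-connection structure the question describes, with the connection opened once and passed into each query function (function and table names are illustrative, not from the post):

```r
# Sketch: one shared connection, passed to each query function,
# closed exactly once on exit. Names here are hypothetical.
library(RMySQL)

query_one <- function(con, n) {
  dbGetQuery(con, sprintf("SELECT a FROM t1 WHERE n = %d", n))
}

query_two <- function(con, n) {
  dbGetQuery(con, sprintf("SELECT b FROM t2 WHERE n = %d", n))
}

run_all <- function(n) {
  con <- dbConnect(MySQL(), dbname = "mydb", user = "user", password = "pass")
  on.exit(dbDisconnect(con))   # disconnect once, no matter how we exit
  list(q1 = query_one(con, n),
       q2 = query_two(con, n))
}
```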
Or is it the fact that I'm running multiple, lengthy query requests? If that's the case, would it be better to split each "query function" into its own R script, shell_exec each one, and allow the user to select which graphs to display (i.e. check boxes in HTML/PHP that trigger execution of each script/graph as desired)?
Through testing, I know the logic is there, but I might be totally missing something. How can I speed up this process so the website user doesn't have to stare at a loading screen forever and can get tangible results?
Sorry for the lengthy request; I appreciate any help you can give! If you'd like to see the webpage or any of the code for a better idea, I can upload and share it.
thanks!
Edit: It should be noted that I'm using a while loop (x < 100) for some of the calculations; I know loops in R are typically considered expensive, but the whole vectorization thing (I think that's the name?) is over my head.
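For what it's worth, vectorization usually just means replacing an explicit loop with a single operation on a whole vector. A small self-contained sketch of the pattern (the squaring calculation is a made-up stand-in for whatever the while loop computes):

```r
# Sketch: a while loop of the kind described, and its vectorised equivalent.
# The actual calculation (x^2) is a hypothetical placeholder.

# Loop version: growing a vector one element at a time is slow in R.
x <- 1
result <- numeric(0)
while (x < 100) {
  result <- c(result, x^2)
  x <- x + 1
}

# Vectorised version: one call, no explicit loop.
result2 <- (1:99)^2

stopifnot(identical(result, result2))
```

If the loop has 100 iterations it is unlikely to be the bottleneck here; vectorization matters most when the iteration count is large, but it is the idiomatic style either way.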
Here is my suggestion: your requests are too demanding to be executed synchronously. Use a queue system instead. When a request is made, send it to the queue, and output the results asynchronously when the server is ready. In the meantime, you can redirect the user to another page and use it to notify them when the results are available.