parallel processing - Python fcntl does not acquire lock


I have written code that writes to a CSV file in parallel in Python. When the program finishes, I see a few lines merged together instead of on separate lines. Each line should contain 3 columns; instead it shows the output below.

e.g.

 myname  myage  myvalue   myname  myage  myvaluemyname  myname  myage  myvalue   myage 

From reading a few other questions, I understood that I need to lock the file if I want to avoid such scenarios, so I added the fcntl module. But the file still does not seem to be locked and it produces similar output.

My code:

import fcntl
import psycopg2
import psycopg2.extras
from multiprocessing import Pool


def getdata(x):
    try:
        # data api call (elided here) provides x1, x2
        c.writefile(x, x1, x2)
    except Exception, err:
        print err


class credits:
    def __init__(self):
        self.d = dict()
        self.details = dict()
        self.filename = "abc.csv"
        self.fileopen = open(self.filename, "w")

    def acquire(self):
        fcntl.flock(self.fileopen, fcntl.LOCK_EX)

    def release(self):
        fcntl.flock(self.fileopen, fcntl.LOCK_UN)

    def __del__(self):
        self.fileopen.close()

    def writefile(self, x, x1, x2):
        try:
            self.acquire()
            self.fileopen.write(str(x) + "," + str(x1) + "," + str(x2) + "\n")
        except Exception, e:
            raise e
        finally:
            self.release()


if __name__ == '__main__':
    conn = psycopg2.connect()
    curr = conn.cursor(cursor_factory=psycopg2.extras.DictCursor)
    curr.execute("select * from emp")
    rows = curr.fetchall()

    listdata = []
    for each in rows:
        listdata.append(each[0])

    c = credits()
    p = Pool(processes=5)
    results = p.map(getdata, listdata)
    conn.close()

I had to declare getdata as a top-level function, otherwise it gave me a "can't pickle function" error.
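For reference, a minimal example of that restriction (a throwaway square function, not my real code): Pool.map can only ship picklable, module-level callables to the worker processes.

from multiprocessing import Pool

def square(n):              # top-level function: picklable, so Pool.map can ship it to workers
    return n * n

if __name__ == '__main__':
    p = Pool(processes=2)
    print p.map(square, [1, 2, 3])        # [1, 4, 9]
    # p.map(lambda n: n * n, [1, 2, 3])   # PicklingError: lambdas / nested functions can't be pickled
    p.close()
    p.join()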

Why don't you write to multiple files, one in each separate process, and then merge them? That might be cheaper than ensuring thread safety, which could be more computationally expensive.
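Something like the sketch below (the lookup helper and the part_*.csv names are placeholders for your data api and whatever naming you prefer, and the hard-coded id list stands in for your query results):

import os
import glob
from multiprocessing import Pool

def lookup(x):
    return x, x                              # placeholder for the data api call

def getdata(x):
    # each worker process appends only to its own file, so no locking is needed
    x1, x2 = lookup(x)
    with open("part_%d.csv" % os.getpid(), "a") as f:
        f.write("%s,%s,%s\n" % (x, x1, x2))

if __name__ == '__main__':
    listdata = [1, 2, 3, 4, 5]               # stands in for the emp ids from the query
    p = Pool(processes=5)
    p.map(getdata, listdata)
    p.close()
    p.join()                                 # make sure all workers have finished and flushed

    # merge the per-process files into the final csv
    with open("abc.csv", "w") as out:
        for part in glob.glob("part_*.csv"):
            with open(part) as f:
                out.write(f.read())
            os.remove(part)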

