ADO.NET - Double value changed after transmitting it via .NET Remoting
I have an application that uses ADO.NET DataSets and data adapters in combination with .NET Remoting (client/server architecture, transmitting DataSets via Remoting).
I am facing the following issue:
TL;DR: the double value 44.850000925362000 turns into 44.850000925362004 after sending the DataSet to the server via Remoting.
I create a new row in the database by saving the DataSet; the table contains a float column (mapped to double in the DataSet). The double value saved is 44.850000925362.
I read the row back from the database (DataAdapter.Fill) and get the same value (checked with BitConverter.DoubleToInt64Bits). The DataSet is passed to the client via Remoting and merged into the use-case DataSet on the client, still retaining the same value.
When this DataSet is merged into the use-case DataSet, the row is imported into a different table (because it is read from a view but saved to a table), and another value is changed before the use-case DataSet (now containing the row in the other table) is transmitted back.
On the client side the value is still the same, but when the DataSet reaches the server, the value in question is different (although no changes were made to that specific column - it is still marked as unchanged, yet its original value differs).
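For reference, the check looked roughly like this (table and column names are placeholders for the real schema). BitConverter.DoubleToInt64Bits exposes the exact IEEE 754 bit pattern, so two doubles only report the same number here if they are bit-for-bit identical:

using System;
using System.Data;

static class BitCheck
{
    // Prints the tracked double plus its exact bit pattern at a given stage.
    // "Measurements" and "Weight" are placeholder names.
    public static void Dump(DataSet ds, string stage)
    {
        double value = (double)ds.Tables["Measurements"].Rows[0]["Weight"];
        Console.WriteLine("{0}: {1:R} bits = {2}",
            stage, value, BitConverter.DoubleToInt64Bits(value));
    }
}

Called after Fill, after the merge/import on the client, and again on the server right before the update, this kind of check is how the change was narrowed down to the server side.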
Example:
Save: 44.850000925362000
Read: 44.850000925362000
Merge, import, modify row: still 44.850000925362000
Send to server for saving: 44.850000925362004 on the server!
...which causes a ConcurrencyException, because the record was saved with 44.850000925362000 but the data adapter's update uses 44.850000925362004 in the WHERE condition (optimistic concurrency).
Nobody else touched the row in between.
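For illustration, here is a stripped-down sketch of the server-side save (connection string, SELECT and table name are placeholders, not the actual code). The command builder generates an UPDATE whose WHERE clause compares the original value of every column, so a double that is off by one ULP matches zero rows and the adapter surfaces this as System.Data.DBConcurrencyException:

using System;
using System.Data;
using System.Data.SqlClient;

static class SaveSketch
{
    public static void Save(DataSet changes)
    {
        using (var connection = new SqlConnection("<connection string>"))
        using (var adapter = new SqlDataAdapter("SELECT * FROM Measurements", connection))
        using (var builder = new SqlCommandBuilder(adapter))
        {
            try
            {
                // Optimistic concurrency: the generated UPDATE only matches
                // rows whose stored values equal the DataRow's original values.
                adapter.Update(changes, "Measurements");
            }
            catch (DBConcurrencyException)
            {
                // Zero rows matched: the original double transmitted from the
                // client (...925362004) no longer equals the stored value
                // (...925362000), even though nobody else touched the row.
                throw;
            }
        }
    }
}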
Update
I tried setting up a test server, and it works fine there. The funny thing is: the same assembly works fine if I use it in a different service. I can't find anything in the config or startup that would explain this. I'm using the binary formatter on both, both are .NET 4.0, both use the same source code... yet one behaves differently from the other.
Further update
I captured the SQL statement being executed by the update. If I run the parameters of its WHERE clause in a SELECT statement, it fetches the correct record. When I do this manually (via SQL Server Management Studio), it accepts a small delta between the value in the row and the value I give in the condition. Still, it doesn't work at all when running the update via the adapter.
Anyway, I've given up. I've resorted to rounding to 5 digits - way more precision than I need in this use case anyway. It might yield weird results if the number gets very large, but I don't expect that here (we're talking about weight in kilograms).
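In essence (a sketch of the workaround; where exactly the rounding happens in the real code doesn't matter for the idea):

double received = 44.850000925362004;             // value as it arrives on the server
double rounded  = System.Math.Round(received, 5); // 44.85 - the one-ULP noise is gone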
I can tell you what is happening here. I'm not sure why, though:
If you interpret the bits of 44.850000925362000 and 44.850000925362004 as Int64, you get values of 4631508893792468976 and 4631508893792468977.
As you can see, the second value is the first one incremented by one.
So it looks like someone, somewhere interprets the double value as an Int64 and increments it - maybe to indicate a new version of the row, because you modified it.
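You can verify this yourself with a few lines (the literals are the two values from above; the program should print the two Int64 values quoted above and a difference of 1):

using System;

class BitCompare
{
    static void Main()
    {
        double before = 44.850000925362000;
        double after  = 44.850000925362004;

        long beforeBits = BitConverter.DoubleToInt64Bits(before);
        long afterBits  = BitConverter.DoubleToInt64Bits(after);

        Console.WriteLine(beforeBits);             // 4631508893792468976
        Console.WriteLine(afterBits);              // 4631508893792468977
        Console.WriteLine(afterBits - beforeBits); // 1 -> adjacent bit patterns, one ULP apart
    }
}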