Converting a large set of data into C# objects


I'm making a complex app (the planning involves: articles, sales, clients, manufacturing, machines...) that uses the information provided by the SQL Server database of an ERP.

I use 30 different related objects, each of which has its info stored in a table/view. Some of the tables have 20k to 100k records.

I need to convert these tables into C# objects for future processing that cannot be handled in SQL. I don't need all the rows, but there isn't a way to determine exactly which ones I will need, since that depends on runtime events.

The question is what the best way to do this is. I have tried the following approaches:

  1. Retrieve all the data and store it in a DataSet using a SqlDataAdapter, which occupies about 300 MB of RAM. The first problem here is synchronization, but it's admissible since the data isn't going to change during execution.

    I then ran through every row and converted it to a C# object, stored in static dictionaries for fast access through a key. The problem is that creating so many objects (millions) takes memory usage up to 1.4 GB, which is too much. Aside from the memory, though, data access is very fast. A sketch of this approach follows.
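A minimal sketch of this first approach, with a single hypothetical Articulo type and illustrative table/column names (none of these identifiers come from the original post):

    using System;
    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;

    public class Articulo
    {
        public string Id { get; }
        public string Descripcion { get; set; }
        public Articulo(string id) { Id = id; }
    }

    public static class CargaTotal
    {
        // Static dictionary for fast keyed access, as described above.
        public static readonly Dictionary<string, Articulo> Articulos =
            new Dictionary<string, Articulo>();

        public static void Cargar(string connectionString)
        {
            var datos = new DataSet();
            using (var adapter = new SqlDataAdapter(
                "SELECT IdArticulo, Descripcion FROM Articulos", connectionString))
            {
                adapter.Fill(datos, "Articulos"); // the ~300 MB step
            }

            // Converting every row up front is what drives memory to ~1.4 GB.
            foreach (DataRow fila in datos.Tables["Articulos"].Rows)
            {
                var articulo = new Articulo(fila["IdArticulo"].ToString())
                {
                    Descripcion = fila["Descripcion"].ToString()
                };
                Articulos.Add(articulo.Id, articulo);
            }
        }
    }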

So, since loading everything takes too much memory, I thought I needed some kind of lazy loading, and I tried:

  2. Another option I have considered is querying the database directly through a SqlDataReader, filtering for the item I need only the first time it's required; after that, it's stored in a static dictionary. This way memory usage is minimal, but this way is very slow (on the order of minutes), because it means making a million different queries, which the server doesn't seem to like (low performance). A sketch of this approach follows.
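A sketch of this second approach, reusing the hypothetical Articulo class from the previous sketch; every cache miss costs one round trip to the server, which is what makes it slow in the aggregate:

    using System.Collections.Generic;
    using System.Data.SqlClient;

    public static class CargaPerezosa
    {
        private static readonly Dictionary<string, Articulo> _cache =
            new Dictionary<string, Articulo>();

        public static Articulo Obtener(string idArticulo, string connectionString)
        {
            idArticulo = idArticulo.ToUpper();
            if (_cache.TryGetValue(idArticulo, out var articulo))
                return articulo; // already loaded: no query needed

            // First request for this key: one query per item, so potentially
            // millions of round trips over the life of the process.
            using (var conexion = new SqlConnection(connectionString))
            using (var comando = new SqlCommand(
                "SELECT Descripcion FROM Articulos WHERE IdArticulo = @id",
                conexion))
            {
                comando.Parameters.AddWithValue("@id", idArticulo);
                conexion.Open();
                using (var lector = comando.ExecuteReader())
                {
                    if (!lector.Read())
                        return null; // unknown id
                    articulo = new Articulo(idArticulo)
                    {
                        Descripcion = lector["Descripcion"].ToString()
                    };
                }
            }
            _cache.Add(idArticulo, articulo);
            return articulo;
        }
    }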

Lastly, I tried an intermediate approach that kind of works, but I'm not sure it's optimal; I suspect it's not:

  3. A third option is to fill a DataSet containing all the info and store a local static copy, but not convert the rows to objects except on demand (lazily), like this:

    public class ProductoTerminado : Articulo
    {
        private static Dictionary<string, ProductoTerminado> productosTerminados =
            new Dictionary<string, ProductoTerminado>();

        public PinturaTipo PinturaTipo { get; set; }

        public ProductoTerminado(string id) : base(id) { }

        public static ProductoTerminado Obtener(string idArticulo)
        {
            idArticulo = idArticulo.ToUpper();
            if (productosTerminados.ContainsKey(idArticulo))
            {
                return productosTerminados[idArticulo];
            }
            else
            {
                var productoTerminado = new ProductoTerminado(idArticulo);
                // Get the row for this id from the static DataSet.
                var fila = Datos.BD.Tables["Articulos"]
                    .Select("IdArticulo = '" + idArticulo + "'").First();
                // Then fill the object and add it to the dictionary.
                productoTerminado.Descripcion = fila["Descripcion"].ToString();
                productoTerminado.Paletizacion = Convert.ToInt32(fila["CantidadBulto"]);
                productoTerminado.PinturaTipo = PinturaTipo.Obtener(fila["PT"].ToString());
                productosTerminados.Add(idArticulo, productoTerminado);
                return productoTerminado;
            }
        }
    }
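One detail worth noting about this third approach: DataTable.Select scans the table on every cache miss. If IdArticulo is unique, setting a primary key on the cached table once lets you use Rows.Find, which goes through the table's internal index. A sketch of that tweak, assuming the same static DataSet:

    using System.Data;

    public static class IndiceArticulos
    {
        // One-time setup after filling the static DataSet: index the id column.
        public static void Indexar(DataTable articulos)
        {
            articulos.PrimaryKey = new[] { articulos.Columns["IdArticulo"] };
        }

        // Indexed lookup to use on each cache miss instead of a linear Select().
        public static DataRow Buscar(DataTable articulos, string idArticulo)
        {
            return articulos.Rows.Find(idArticulo);
        }
    }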

So, is this the way to proceed, or should I use Entity Framework or a typed DataSet?

"I use relations between 30 different objects, each of which has its info stored in a table/view. Some of the tables have 20k to 100k records."

I suggest making a different decision for different types of objects. Usually, tables that have thousands of records are more likely to change, while tables that have fewer records are less likely to. In the project I am working on, the decision was to cache the objects that don't change in a List&lt;T&gt; (at start-up). A few hundred instances should take less than a second to load; see the sketch below.
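A minimal sketch of such a start-up cache, with hypothetical types and a caller-supplied loader (nothing here comes from the original answer):

    using System.Collections.Generic;
    using System.Linq;

    public class PinturaTipo
    {
        public string Id { get; set; }
        public string Descripcion { get; set; }
    }

    public static class CacheEstatica
    {
        // Small, rarely-changing lookup tables, materialized once at start-up.
        public static List<PinturaTipo> PinturaTipos { get; private set; }

        public static void Inicializar(IEnumerable<PinturaTipo> desdeBd)
        {
            // A few hundred instances: loading eagerly costs well under a second.
            PinturaTipos = desdeBd.ToList();
        }
    }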

If you are using LINQ-to-SQL, have your object local in a List&lt;T&gt;, and have correctly defined FK constraints, you can use obj.Items to access the Items table filtered by obj's id (in this example obj is on the PK side and Items is the FK table). A sketch of that navigation follows.
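A sketch of that FK navigation with attribute-mapped LINQ-to-SQL classes; the Cliente/Pedido names, columns, and connection string are all hypothetical stand-ins for the obj/Items example above:

    using System;
    using System.Data.Linq;
    using System.Data.Linq.Mapping;
    using System.Linq;

    [Table(Name = "Clientes")]
    public class Cliente
    {
        [Column(IsPrimaryKey = true)]
        public string IdCliente { get; set; }

        private EntitySet<Pedido> _pedidos = new EntitySet<Pedido>();

        // FK association: cliente.Pedidos is the Pedidos table filtered by this id.
        [Association(Storage = "_pedidos", OtherKey = "IdCliente")]
        public EntitySet<Pedido> Pedidos
        {
            get { return _pedidos; }
            set { _pedidos.Assign(value); }
        }
    }

    [Table(Name = "Pedidos")]
    public class Pedido
    {
        [Column(IsPrimaryKey = true)] public string IdPedido { get; set; }
        [Column] public string IdCliente { get; set; }
    }

    class Demo
    {
        static void Main()
        {
            using (var db = new DataContext("...connection string..."))
            {
                // PK side: one client...
                var cliente = db.GetTable<Cliente>()
                                .Single(c => c.IdCliente == "C001");

                // ...FK side: its orders. Accessing the association property
                // triggers a quick query filtered by the client's id.
                foreach (var pedido in cliente.Pedidos)
                    Console.WriteLine(pedido.IdPedido);
            }
        }
    }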

This design will give your users the performance they expect: when working on small sets it is instantaneous (cached), and when working on larger sets but making small selects or inserts, performance is still good (quick queries that use the PK). You only suffer when you start doing queries that join multiple big tables; in those cases, users will probably expect it to take a while (though I can't say for sure without knowing more about your use case).

