While writing a data-query component, I needed to convert the result set into an object collection, and the results I saw during testing were a bit unexpected: filling the object collection from a DataReader is more efficient than filling a DataSet with DataAdapter.Fill! I don't know the internal workings of DataAdapter.Fill, but the object collection is strongly typed and must convert each value as it loads, while the DataSet is not strongly typed. In principle, filling the DataSet should be the more efficient operation, yet the test results show the opposite!
The test code is as follows (it retrieves all 830 records in the Orders table):
System.IO.StreamWriter write = new System.IO.StreamWriter(@"C:\testreader.txt");
Console.SetOut(write);
using (System.Data.SqlClient.SqlConnection conn = new System.Data.SqlClient.SqlConnection("Data Source=.;Initial Catalog=Northwind;User ID=sa;Pwd=;"))
{
    XSF.XUtils.XCounter counter = new XSF.XUtils.XCounter();
    System.Data.SqlClient.SqlCommand cmd = new System.Data.SqlClient.SqlCommand();
    cmd.CommandText = "select * from Orders";
    cmd.Connection = conn;
    conn.Open();
    for (int i = 0; i < 20; i++)
    {
        // Pass 1: fill the object collection from a DataReader.
        using (System.Data.IDataReader reader = cmd.ExecuteReader())
        {
            System.Collections.ArrayList list = new System.Collections.ArrayList();
            counter.Start();
            while (reader.Read())
            {
                Example.Entitys.OrdersOjb order = new Example.Entitys.OrdersOjb();
                order.LoadInfo(reader);
                list.Add(order);
            }
            counter.Stop();
            Console.Write("Orders:" + counter.Duration.ToString("0.000000000000000000000000000") + "\r\n");
        }
        // Pass 2: fill a DataSet through a DataAdapter using the same command.
        System.Data.SqlClient.SqlDataAdapter da = new System.Data.SqlClient.SqlDataAdapter(cmd);
        System.Data.DataSet myds = new System.Data.DataSet();
        counter.Start();
        da.Fill(myds);
        counter.Stop();
        Console.Write("DataSet:" + counter.Duration.ToString("0.000000000000000000000000000") + "\r\n");
    }
}
write.Close();
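The XCounter used above is the author's own timing helper, not a framework class. A minimal stand-in, assuming it is just a thin wrapper over a high-resolution timer that reports elapsed seconds (the class below and its use of System.Diagnostics.Stopwatch are my assumption, not the author's actual implementation), might look like this:

// Hypothetical stand-in for XSF.XUtils.XCounter; assumes it wraps a
// high-resolution timer and exposes the elapsed time in seconds.
public class XCounter
{
    private System.Diagnostics.Stopwatch watch = new System.Diagnostics.Stopwatch();

    public void Start()
    {
        watch.Reset();   // each Start() begins a fresh measurement, as the test loop expects
        watch.Start();
    }

    public void Stop()
    {
        watch.Stop();
    }

    // Elapsed seconds, matching the magnitudes printed in the results below.
    public double Duration
    {
        get { return (double)watch.ElapsedTicks / System.Diagnostics.Stopwatch.Frequency; }
    }
}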
The mapping object corresponding to the table:
[Serializable]
public class OrdersOjb
{
    ...
    ...
    public void LoadInfo(System.Data.IDataReader pDataReader)
    {
        if (pDataReader[0] != DBNull.Value)
        {
            this.OrderID = (Int32)pDataReader[0];
        }
        if (pDataReader[1] != DBNull.Value)
        {
            this.CustomerID = (string)pDataReader[1];
        }
        if (pDataReader[2] != DBNull.Value)
        {
            this.EmployeeID = (Int32)pDataReader[2];
        }
        if (pDataReader[3] != DBNull.Value)
        {
            this.OrderDate = (DateTime)pDataReader[3];
        }
        if (pDataReader[4] != DBNull.Value)
        {
            this.RequiredDate = (DateTime)pDataReader[4];
        }
        if (pDataReader[5] != DBNull.Value)
        {
            this.ShippedDate = (DateTime)pDataReader[5];
        }
        if (pDataReader[6] != DBNull.Value)
        {
            this.ShipVia = (Int32)pDataReader[6];
        }
        if (pDataReader[7] != DBNull.Value)
        {
            this.Freight = (decimal)pDataReader[7];
        }
        if (pDataReader[8] != DBNull.Value)
        {
            this.ShipName = (string)pDataReader[8];
        }
        if (pDataReader[9] != DBNull.Value)
        {
            this.ShipAddress = (string)pDataReader[9];
        }
        if (pDataReader[10] != DBNull.Value)
        {
            this.ShipCity = (string)pDataReader[10];
        }
        if (pDataReader[11] != DBNull.Value)
        {
            this.ShipRegion = (string)pDataReader[11];
        }
        if (pDataReader[12] != DBNull.Value)
        {
            this.ShipPostalCode = (string)pDataReader[12];
        }
        if (pDataReader[13] != DBNull.Value)
        {
            this.ShipCountry = (string)pDataReader[13];
        }
    }
}
In the mapping object, column indexes are used instead of field names, because the tests showed index access to be 2-3 microseconds faster per read than access by name. That small difference is what reversed the overall outcome: the object fill went from roughly 50% slower than the DataSet fill to roughly 25% faster.
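A middle ground between readability and speed (not used in the original code; shown here only as a sketch) is to resolve each column's ordinal once with GetOrdinal before the read loop, then read by index with the typed getters, which also avoids the boxing that the object indexer incurs:

// Sketch: resolve ordinals once, then read by index inside the loop.
// Column names are from the Northwind Orders table used in the test.
using (System.Data.IDataReader reader = cmd.ExecuteReader())
{
    int ordOrderID = reader.GetOrdinal("OrderID");
    int ordFreight = reader.GetOrdinal("Freight");
    while (reader.Read())
    {
        if (!reader.IsDBNull(ordOrderID))
        {
            int orderId = reader.GetInt32(ordOrderID);   // typed getter, no boxing
        }
        if (!reader.IsDBNull(ordFreight))
        {
            decimal freight = reader.GetDecimal(ordFreight);
        }
        // ... remaining columns follow the same pattern
    }
}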
Test results:
Orders: 0.055864642014557700000000000
Dataset: 0.110196687009103000000000000
Orders: 0.023227253743143300000000000
Dataset: 0.020698999453841200000000000
Orders: 0.016390910017893300000000000
Dataset: 0.023404929956181600000000000
Orders: 0.017637995890539200000000000
Dataset: 0.025213819074770700000000000
Orders: 0.017227887901954000000000000
Dataset: 0.024791977751362300000000000
Orders: 0.017724040345909900000000000
Dataset: 0.022490009205080500000000000
Orders: 0.023765031589210400000000000
Dataset: 0.021654986876823700000000000
Orders: 0.015963760757303000000000000
Dataset: 0.021302707467010500000000000
Orders: 0.015803405181384800000000000
Dataset: 0.020940929643292700000000000
Orders: 0.015981081394423000000000000
Dataset: 0.020426059736642500000000000
Orders: 0.015780217876853100000000000
Dataset: 0.024939482531997800000000000
Orders: 0.015689982944759700000000000
Dataset: 0.020697043898037300000000000
Orders: 0.015734681363134100000000000
Dataset: 0.021563913849385900000000000
Orders: 0.026096892202780000000000000
Dataset: 0.020594516900891000000000000
Orders: 0.015682998816888700000000000
Dataset: 0.020990656633734200000000000
Orders: 0.015992255999016600000000000
Dataset: 0.021358021759748800000000000
Orders: 0.016322465564757500000000000
Dataset: 0.020486961331677600000000000
Orders: 0.015694173421482300000000000
Dataset: 0.028225933742975700000000000
Orders: 0.015794744862824700000000000
Dataset: 0.022132701223200200000000000
Orders: 0.015866821062453500000000000
Dataset: 0.020778059781340900000000000
However, an object collection lacks many of the features a DataSet provides, and hand-writing a LoadInfo-style method for every mapping object is a lot of work. CodeDOM can be used to generate these operation objects at runtime and take over most of that workload (for a description of applying CodeDOM, see http://henryfan.cnblogs.com/archive/2005/11/28/286036.html).
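As a rough illustration of that idea, the sketch below uses CodeDOM to emit the source of a mapping class at runtime. The generator structure and the MappingClassGenerator name are my own assumption, not the code from the linked article, and only one field and one assignment are generated to keep the pattern short:

// Hedged CodeDOM sketch: emits C# source for a mapping class at runtime.
using System;
using System.CodeDom;
using System.CodeDom.Compiler;
using Microsoft.CSharp;

public class MappingClassGenerator   // hypothetical name, for illustration only
{
    public static void Emit()
    {
        CodeCompileUnit unit = new CodeCompileUnit();
        CodeNamespace ns = new CodeNamespace("Example.Entitys");
        unit.Namespaces.Add(ns);

        CodeTypeDeclaration cls = new CodeTypeDeclaration("OrdersOjb");
        cls.IsClass = true;
        ns.Types.Add(cls);

        // public int OrderID; -- repeat per table column
        CodeMemberField field = new CodeMemberField(typeof(int), "OrderID");
        field.Attributes = MemberAttributes.Public;
        cls.Members.Add(field);

        // public void LoadInfo(IDataReader pDataReader) { this.OrderID = (int)pDataReader[0]; }
        CodeMemberMethod load = new CodeMemberMethod();
        load.Name = "LoadInfo";
        load.Attributes = MemberAttributes.Public | MemberAttributes.Final;
        load.Parameters.Add(new CodeParameterDeclarationExpression(
            typeof(System.Data.IDataReader), "pDataReader"));
        load.Statements.Add(new CodeAssignStatement(
            new CodeFieldReferenceExpression(new CodeThisReferenceExpression(), "OrderID"),
            new CodeCastExpression(typeof(int),
                new CodeIndexerExpression(
                    new CodeArgumentReferenceExpression("pDataReader"),
                    new CodePrimitiveExpression(0)))));
        cls.Members.Add(load);

        // Write the generated C# source to the console; it could instead be
        // compiled in memory with CodeDomProvider.CompileAssemblyFromDom.
        CodeDomProvider provider = new CSharpCodeProvider();
        provider.GenerateCodeFromCompileUnit(unit, Console.Out, new CodeGeneratorOptions());
    }
}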