Let's start with the interaction between the data factory and the database: the DALFactory hands off to PetShop.SQLServerDAL, which in turn uses SqlHelper.cs — a file I'm sure most readers are familiar with. Before accessing the PetShop database, we first model objects onto the relational schema — the O/R mapping we usually talk about. In the Model layer, the user's address in the Account table is modeled as AddressInfo. (Of course, this particular object granularity is the Microsoft team's choice; you could go further and extract the account balance and so on.) This approach minimizes the impact of future database changes, and I like it. ^_^
namespace PetShop.Model {
    /// <summary>
    /// Business entity used to model addresses
    /// </summary>
    [Serializable]
    public class AddressInfo {
        // Internal member variables
        private string _firstName;
        private string _lastName;
        private string _address1;
        private string _address2;
        private string _city;
        private string _state;
        private string _zip;
        private string _country;
        private string _phone;

        public AddressInfo(string firstName, string lastName, string address1, string address2, string city, string state, string zip, string country, string phone) {
            this._firstName = firstName;
            this._lastName = lastName;
            this._address1 = address1;
            this._address2 = address2;
            this._city = city;
            this._state = state;
            this._zip = zip;
            this._country = country;
            this._phone = phone;
        }
    }
}
Above, I omitted all the properties and kept only the core of the model for discussion.
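For reference, the omitted members are just read-only properties over the private fields. A minimal sketch of the pattern (one field shown; the property name is assumed from the field name, and the rest follow the same shape):

```csharp
using System;

namespace PetShop.Model {
    [Serializable]
    public class AddressInfo {
        private string _firstName;

        public AddressInfo(string firstName) {
            this._firstName = firstName;
        }

        // Each omitted member is a simple read-only property over its field:
        public string FirstName {
            get { return _firstName; }
        }
    }
}

public static class PropertyDemo {
    public static void Main() {
        var info = new PetShop.Model.AddressInfo("Jane");
        Console.WriteLine(info.FirstName);  // Jane
    }
}
```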
At the underlying IDAL data-access interface layer, we can see that a SqlCommand with CommandType.Text is used, and results come back through a SqlDataReader — a forward-only, pointer-like cursor that fetches only the minimum data required, rather than using an entire DataSet as the transmission medium. The reader's columns match the fields of the model one for one. If the database schema is extended in the future, you only need to adjust the model and the access function once; the other BizLogic functions keep working unchanged. Code:
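To make the layering concrete, here is a hedged sketch of the IDAL idea. The types below are simplified stand-ins (FakeAccount and the one-field AddressInfo are invented for illustration; only the IAccount.GetAddress signature mirrors the real PetShop contract):

```csharp
using System;

// Minimal stand-in for the PetShop model type discussed above.
public class AddressInfo {
    public readonly string FirstName;
    public AddressInfo(string firstName) { FirstName = firstName; }
}

// The IDAL contract: BizLogic programs against this interface, so swapping
// in a different database (or widening the schema) never touches the callers.
public interface IAccount {
    AddressInfo GetAddress(string userId);
}

// A fake implementation standing in for PetShop.SQLServerDAL.Account.
public class FakeAccount : IAccount {
    public AddressInfo GetAddress(string userId) {
        return new AddressInfo("Harold");   // a real DAL would query the database here
    }
}

public static class IdalDemo {
    public static void Main() {
        IAccount dal = new FakeAccount();   // the DALFactory would pick the concrete class
        Console.WriteLine(dal.GetAddress("demo").FirstName);  // Harold
    }
}
```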
namespace PetShop.SQLServerDAL {
    /// <summary>
    /// Summary description for AccountDALC.
    /// </summary>
    public class Account : IAccount {
        private const string SQL_SELECT_ACCOUNT = "SELECT Account.Email, Account.FirstName, Account.LastName, Account.Addr1, Account.Addr2, Account.City, Account.State, Account.Zip, Account.Country, Account.Phone, Profile.LangPref, Profile.FavCategory, Profile.MyListOpt, Profile.BannerOpt FROM Account INNER JOIN Profile ON Account.UserId = Profile.UserId INNER JOIN Signon ON Account.UserId = Signon.UserName WHERE Signon.UserName = @UserId AND Signon.Password = @Password";

        /// <summary>
        /// Mapping the returned result set to the address model
        /// </summary>
        /// <param name="userId"></param>
        /// <returns></returns>
        public AddressInfo GetAddress(string userId) {
            AddressInfo address = null;
            SqlParameter[] addressParms = GetAddressParameters();
            addressParms[0].Value = userId;
            using (SqlDataReader rdr = SqlHelper.ExecuteReader(SqlHelper.CONN_STRING_NON_DTC, CommandType.Text, SQL_SELECT_ADDRESS, addressParms)) {
                if (rdr.Read()) {
                    address = new AddressInfo(rdr.GetString(0), rdr.GetString(1), rdr.GetString(2), rdr.GetString(3), rdr.GetString(4), rdr.GetString(5), rdr.GetString(6), rdr.GetString(7), rdr.GetString(8));
                }
            }
            return address;
        }
        // (The SQL_SELECT_ADDRESS constant and GetAddressParameters() helper are omitted here.)
I omitted the connection fields. In ADO.NET, we all know the DataSet is disconnected and lives in system memory. After opening the connection, PetShop does not read the data immediately; instead the DataReader is passed to another object, which performs the read, and only then is the connection closed. This lengthens the time the connection stays open, and database connections are a very precious server resource. By contrast, Duwamish fills its DataSet immediately after connecting and then releases the connection quickly, which is friendlier to concurrent access by large numbers of users. So personally I think master/configuration tables should still be read in one pass. After all, in a typical application the customer account table rarely exceeds 10,000 rows; for such frequently accessed data you only need to refresh the in-memory copy periodically and then use a RowFilter to retrieve rows, avoiding frequent round-trips over the connection. In practice this performs better. For data such as orders, real-time access to the database is more appropriate. After all, in enterprise-level development, robustness, stability, and performance of the program are the primary considerations.
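The RowFilter approach mentioned above can be sketched like this — an in-memory DataTable stands in for a periodically refreshed master table (the table, column names, and data are made up for illustration):

```csharp
using System;
using System.Data;

public static class RowFilterDemo {
    // Builds a small cached master table and counts the rows matching a filter,
    // entirely in memory -- no database round-trip per request.
    public static int CountWhere(string filter) {
        DataTable accounts = new DataTable("Account");
        accounts.Columns.Add("UserId", typeof(string));
        accounts.Columns.Add("Country", typeof(string));
        accounts.Rows.Add("alice", "USA");
        accounts.Rows.Add("bob", "China");
        accounts.Rows.Add("carol", "USA");

        // A DataView applies the filter expression over the cached rows.
        DataView view = new DataView(accounts);
        view.RowFilter = filter;
        return view.Count;
    }

    public static void Main() {
        Console.WriteLine(CountWhere("Country = 'USA'"));  // 2
    }
}
```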
Now let's look at Duwamish's data access. The preliminary work is to map each table to a corresponding DataTable, so that next time we don't have to write out all those field names again. It is pure manual labor: the schema I maintained last time had 110 fields, and writing that mapping was truly horrible — after finishing the modules I had typed out all 110 fields by hand. Ugh.
namespace Duwamish7.DataAccess {
    public class Customers : IDisposable
    {
        //
        // DataSetCommand object
        //
        private SqlDataAdapter dsCommand;
        //
        // Stored procedure parameter names
        //
        private const string PKID_PARM = "@PKId";
        private const string EMAIL_PARM = "@Email";
        private const string NAME_PARM = "@Name";
        private const string ADDRESS_PARM = "@Address";
        private const string COUNTRY_PARM = "@Country";
        private const string PASSWORD_PARM = "@Password";
        private const string PHONE_PARM = "@PhoneNumber";
        private const string FAX_PARM = "@Fax";
        /// <summary>
        /// Constructor for Customers.
        /// <remarks>Initialize the internal DataSetCommand object.</remarks>
        /// </summary>
        public Customers()
        {
            //
            // Create the DataSetCommand
            //
            dsCommand = new SqlDataAdapter();
            dsCommand.TableMappings.Add("Table", CustomerData.CUSTOMERS_TABLE);
        }
    } // class Customers
} // namespace Duwamish7.DataAccess
public CustomerData GetCustomerByEmail(string emailAddress, string password)
{
    //
    // Check preconditions
    //
    ApplicationAssert.CheckCondition(emailAddress != String.Empty, "Email address is required",
        ApplicationAssert.LineNumber);
    ApplicationAssert.CheckCondition(password != String.Empty, "Password is required",
        ApplicationAssert.LineNumber);
    //
    // Get the customer DataSet
    //
    CustomerData dataSet;
    using (DataAccess.Customers customersDataAccess = new DataAccess.Customers())
    {
        dataSet = customersDataAccess.LoadCustomerByEmail(emailAddress);
    }
    //
    // Verify the customer's password
    //
    DataRowCollection rows = dataSet.Tables[CustomerData.CUSTOMERS_TABLE].Rows;
    if ((rows.Count == 1) && rows[0][CustomerData.PASSWORD_FIELD].Equals(password))
    {
        return dataSet;
    }
    else
    {
        return null;
    }
}
But nowadays many O/R mapping tools have appeared, and much of the mapping code can be handed over to them. After all, in enterprise-level development what sells is strong business logic — the ability to help customers analyze and solve problems, with a knowledge base for reference. Another benefit of using a DataSet is XML conversion: the returned DataSet or DataTable can be serialized and deserialized with DataSet.ReadXml and DataSet.GetXml, though it is said the overhead is relatively large. XML, as a cross-platform medium, is mainly used for orders, financial reports, and configuration, so in typical applications this won't incur much cost.
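The XML round-trip mentioned above, as a runnable in-memory sketch (the table and column names are invented for the example):

```csharp
using System;
using System.Data;
using System.IO;

public static class XmlDemo {
    // Serializes a DataSet to XML with GetXml and reads it back with ReadXml,
    // as one might when shipping an order across a platform boundary.
    public static int RoundTripRowCount() {
        DataSet orders = new DataSet("Orders");
        DataTable t = orders.Tables.Add("Order");
        t.Columns.Add("OrderId", typeof(int));
        t.Columns.Add("Total", typeof(decimal));
        t.Rows.Add(1, 42.50m);

        string xml = orders.GetXml();          // data only; the schema is inferred on read
        DataSet copy = new DataSet();
        copy.ReadXml(new StringReader(xml));
        return copy.Tables["Order"].Rows.Count;
    }

    public static void Main() {
        Console.WriteLine(RoundTripRowCount());  // 1
    }
}
```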
By the way, I still have a newbie-level question: how do I paste syntax-highlighted source code? The Style -> Code option my blog host provides is too ugly — you can't make out the keywords. I hope someone can tell me. Thanks. ^_^