Migrating from ADO "Classic" to ADO.NET


This digest is from Hitchhiker's Guide to Visual Studio and SQL Server (7th Edition)

William Vaughn
Beta V Corporation

Applicable to
Microsoft ADO.NET
Microsoft SQL Server 2005 (code name "Yukon")
Microsoft Visual Studio 2005 (code name "Whidbey")

Summary: Bill Vaughn discusses how to convert Visual Basic 6.0 ADO code so that it can be used in .NET applications to perform roughly the same operations.

At this time last year, Microsoft invited me to write an article to help COM-based ADO developers understand the mechanisms and problems of porting their data access code to .NET. This year, they asked me to update that article and add information about ADO.NET 2.0. Since I am currently working on the new edition of the Hitchhiker's Guide, I have extracted a fragment from the relevant chapter for readers here. Hitchhiker's Guide to Visual Studio and SQL Server (7th Edition) will be published by Addison-Wesley when Whidbey and Yukon are released.

My thanks to Calvin, Hobbes, and Bill Watterson. For some time, I have been fascinated by the term "transmogrification." This technique has (apparently) been all the rage and is supposedly used to describe the process of turning a frog into a man (or vice versa). The variant used in this article, however, refers to the process of converting Visual Basic 6.0 ADO code so that it can be used in .NET applications to perform roughly the same operations. Transmogrification implies that the converted result is barely recognizable from the original source, and in this case that is entirely appropriate for converting COM-based ADO code (from Visual Basic 6.0) to ADO.NET. Some people think such a conversion is not likely to happen at all; in this article I explain why I disagree.

Why do I need to migrate?

One sunny afternoon while I was repairing a three-speed bike on the lawn behind the house, my father reminded me not to fix parts that aren't broken. I agree wholeheartedly. It makes no sense to modify working code unless there is a compelling reason and a reasonable cost-benefit analysis has been done first. "Reasonable" means not giving in to emotional temptation (whether it comes from your chief programmer or your spouse's relatives), but taking into account a full analysis of the costs of designing, developing, testing, deploying, and supporting the application, as well as retraining expenses for developers, the support team, and customers. You may also need hardware upgrades to run the Framework and the new applications, which is another factor to consider. It is well known that converting code, or even just adjusting it, is expensive. Even if you are careful, every line of code you change can have serious (though usually unintended) consequences and side effects. Nevertheless, there are sound reasons to transform existing code, such as:

Perhaps the existing code cannot take advantage of improvements in hardware or the operating system. For example, perhaps the code was designed to work with (and possibly only with) Windows 9x, and the customer has since upgraded to Windows XP.

Perhaps the existing code is less competitive than code written by other companies, whose applications are faster, more reliable, and selling better. This happens all the time, because customers always strive to stay ahead in features, performance, and price.

Perhaps a new application would be more scalable, able to support more users, more secure, easier to deploy, or able to easily implement features that are not available with the existing technology.

Maybe the customer complains that the code seems to work for a while and then freezes, especially after some other software is installed.

Maybe (perhaps most importantly) a new development platform improves your ability to create new applications, support your code, and serve the customers who use it.

I'm not going to explore the decision-making process of moving from an existing code base to the tempting prospect of a new technology; I'll leave that to your IT staff. If you are reading this article to master the mechanics of converting existing COM-based ADO code (which I call "ADO classic," or ADOc) to run in Visual Basic .NET applications, read on.

How did we get here?

The data access interfaces introduced by Microsoft have undergone quite a few changes over the years. Initially, these interfaces were designed to access specific versions of the Joint Engine Technology (JET) DBMS and SQL Server (through DB-Library). As the technology matured, generic, one-size-fits-all (OSFA) interfaces such as ODBC and OLE DB appeared that can access almost any data source. COM-based ADO (which I call ADOc) was created to make OLE DB easier to access.

ADO "Classic" in Action

Over time, ADOc evolved, and its later versions were widely used and integrated into COM-based Visual Basic. ADOc developers learned how to build applications that manage databases of all sizes and applied them around the world in client/server, middle-tier, and ASP architectures. ADOc also supports stored procedures (including full input/output parameter management), automatic UPDATE query generation, asynchronous operations, client-side and server-side cursors, read-only/forward-only data streams, and more, and it has been generally accepted. However, COM-based ADO has its problems. Its reliance on the MDAC stack makes it prone to DLL Hell: sometimes deployed applications fail when MDAC is upgraded.

Introducing ADO.NET

To address the problem of replaced components breaking deployed applications, Microsoft created the .NET Framework. In addition, Microsoft created a new data access interface: ADO.NET. Early in the process of creating ADO.NET, Microsoft developers called the new data provider "XDO," because the new data access paradigm has XML woven into the interface: it can read XML files to populate its objects, or emit XML as needed to pass data and schemas to other tiers. So the name made sense. However, Microsoft felt that introducing yet another data provider would leave developers overwhelmed or even annoyed, so it named the technology "ADO.NET." Of course, at a high level ADOc and ADO.NET do the same things. Under the hood, however, the two are very different, and in my opinion mostly for the better.

When ADO.NET first appeared, it lacked many of the mature, basic features that ADOc supported: batch updates, asynchronous operations, server-side cursors, runtime generation of action commands, and so on. Some people think ADO.NET was designed for ASP.NET and is unnecessary for client/server applications, but Microsoft firmly believes that disconnected DataSets make client/server applications more efficient and free them from hard-to-scale designs that depend on server-side state management. It is because of this logic that ADO.NET does not include support for server-side cursors. At the heart of this new mindset is a "disconnected" client-side data store that you can easily persist and serialize to XML as needed for delivery to other tiers. It also fits well with Microsoft's new XML Web services and service-oriented architecture (SOA) initiatives. Note that the XML used to persist an ADOc Recordset is incompatible with the XML format ADO.NET expects. (See Much ADO About Data: Doing the Impossible (Again).) Microsoft also believes it is better for developers to build their own SQL action commands (UPDATE, INSERT, DELETE), by hand or through a wizard, rather than relying on the property-driven ADOc Update method, because code that tries to figure out how to post changes from a Recordset often goes wrong. Of course, in simple cases the CommandBuilder can build the action commands automatically, but as I have said elsewhere, I don't think you should take that approach. (See Weaning Developers from the CommandBuilder.) These problems are not unsolvable, but the extra overhead can further impede migration or make the migration process more difficult.

Solving the Problems

On the other hand, Microsoft also lets developers write to managed interfaces that directly address the OSFA issues. That is, ADO.NET exposes .NET data providers dedicated to Microsoft SQL Server (SqlClient) and to other databases, including Oracle and DB2. This means the data provider can take advantage of all the capabilities of the DBMS and communicate with the database at the lowest level. For the first time since DB-Library, ADO.NET can interact with SQL Server through the Tabular Data Stream (TDS), its low-level protocol. As a result, the SqlClient provider can access all SQL Server features and understand their nuances, making applications that much more powerful. For developers who are not concerned with converting code from one DBMS to another, this native interface is preferable to the .NET OSFA interfaces implemented by the OLE DB and ODBC .NET data providers.

ADO.NET also exposes another DB-Library-like feature: direct access to the data stream returned by the low-level data interface. This interface is implemented as the DataReader, which connects your code to the inbound data returned by a query in a forward-only, row-by-row fashion over an open connection. All of the methods in ADO.NET use a DataReader, either directly or behind the scenes, to fetch result sets. This means the application can get data faster and control the process better.

Techniques for Migrating ADO Classic Code

To a large extent, the data access logic in most applications can be divided into several parts:

Getting connections: this involves constructing connection strings, (in some cases) integrating user-supplied credentials, and establishing connections, either just-in-time or held for the duration of the operation.

Managing queries: this means creating query strings, integrating user-supplied parameters, and managing transactions.

Executing queries: this includes selecting the execution method appropriate to the result being returned. Sometimes it means using a method that receives a rowset, a scalar, or a stream, or that simply runs an action command.

Managing (multiple) result sets: these routines store and process rowsets and returned parameters, or bind results to controls. When you work with hierarchical data, they may also manage the relationships between the returned rowsets. In some designs they also position, sort, and filter rows through client-side or server-side cursors.

Managing changes: when data changes, these routines move data from a static source or a bound control into an ADO data structure and manage update, insert, and delete operations through that structure. They also manage concurrency conflicts, batch operations, and transactions.

Managing exceptions: all of these routines have exception handlers that deal with the common problems that occur when establishing a connection, preparing and executing a query, or posting data to the server.

ADO developers will find that they can decompose and write ADO.NET code along similar lines. However, each section differs, which is a little frustrating at first but manageable if you take each one on its own. The Visual Basic 6.0 Conversion Wizard does not convert ADOc code to ADO.NET; that is wishful thinking, because ADO.NET uses a different programming model than ADOc. For example, if you use the SHAPE syntax in ADOc to manage hierarchies, you will find that the .NET Framework does not support SHAPE, but it does support managing multiple related rowsets through the DataSet class: a DataSet can contain multiple DataTable objects, each holding a separate set of rows.

Each .NET data provider is responsible for implementing the ADO.NET classes in its own "style." They all support a set of "core" operations that use (roughly) the same properties and settings. Of course, each .NET data provider also supports its own SQL dialect and ConnectionString settings. For example, the SqlClient provider (for Microsoft SQL Server) implements SqlConnection, SqlCommand, SqlDataAdapter, and SqlDataReader, while the OLE DB provider supports OleDbConnection, OleDbCommand, and so on. In my books, these objects are referred to generically by function; for example, SqlCommand is called the Command class.

The ADO.NET Connection class (SqlConnection) is similar to the ADOc Connection object, but it accepts a different (though familiar) ConnectionString. It cannot be opened asynchronously, but it can be opened and closed automatically by a DataAdapter when the Fill and Update methods execute. In ADO.NET you cannot execute SQL directly against a Connection object; you need to build a Command to do that. You will also have to modify your connection error handlers so they anticipate the same kinds of problems that occur when opening an ADOc connection. Note that the connection pooling mechanism ADOc uses is roughly the same as in ADO.NET, but ADO.NET makes it easier to manage pooling options and state.
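
To make this concrete, here is a minimal sketch (not from the original article) of opening a SqlConnection in ADO.NET; the server, database, and security settings in the connection string are placeholder assumptions you would replace with your own.

```csharp
using System;
using System.Data.SqlClient;

class ConnectionDemo
{
    static void Main()
    {
        // Hypothetical connection string; adjust server, database, and credentials.
        string connectionString =
            "Data Source=(local);Initial Catalog=Northwind;Integrated Security=SSPI";

        // The using block guarantees the connection is closed and returned to the pool.
        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            connection.Open();
            Console.WriteLine("State: {0}", connection.State);
        } // connection.Dispose() runs here, even if an exception is thrown
    }
}
```

The using block plays the role of the explicit Close calls you wrote by hand in ADOc code.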

The ADO.NET Command class is similar to the ADOc Command object, but in this case you must build a Command object to execute SQL or run stored procedures. The ADO.NET Command object exposes different methods than its ADOc counterpart. Unlike ADOc, you also have access to a new low-level interface, the DataReader (SqlDataReader), which exposes a high-speed raw stream of the returned query data. The Command class supports a Parameters collection similar to the one used in ADOc, but note that SQL parameters are marked differently in ADO.NET.
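
The following sketch, assuming a hypothetical Northwind-style Customers table, illustrates the Command-plus-DataReader pattern and the @name parameter markers SqlClient expects (ADOc/ODBC code typically used ? placeholders).

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class CommandDemo
{
    static void Run(SqlConnection connection)
    {
        // Build a Command; SQL cannot be executed directly against the Connection in ADO.NET.
        SqlCommand command = new SqlCommand(
            "SELECT CustomerID, CompanyName FROM Customers WHERE Country = @Country",
            connection);

        // SqlClient marks parameters with @name rather than the ? used in ADOc/ODBC code.
        command.Parameters.Add("@Country", SqlDbType.NVarChar, 30).Value = "Germany";

        // The DataReader streams rows forward-only as they arrive from the server.
        using (SqlDataReader reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                Console.WriteLine("{0}: {1}",
                    reader.GetString(0), reader.GetString(1));
            }
        }
    }
}
```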

You will also find that the ADO.NET DataTable is roughly equivalent to the ADOc Recordset. Although it is not managed as a cursor, it stores and manages the returned rowset more efficiently, and locating a particular row is as easy as addressing an array. With a DataSet, you can also manage several rowsets from different data sources at once: this class manages one or more DataTable objects and the collections of rows they contain. You can write code that relates these rowsets even if they come from disparate sources, and easily locate, filter, and search them based on the parent-child relationships you define.
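
Here is one way that could look in code: a hedged sketch, assuming hypothetical Customers and Orders tables, that fills two DataTables into one DataSet and relates them with a DataRelation.

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class DataSetDemo
{
    static void Run(SqlConnection connection)
    {
        DataSet data = new DataSet();

        // Fill two independent DataTables, one per rowset.
        new SqlDataAdapter("SELECT CustomerID, CompanyName FROM Customers", connection)
            .Fill(data, "Customers");
        new SqlDataAdapter("SELECT OrderID, CustomerID, OrderDate FROM Orders", connection)
            .Fill(data, "Orders");

        // Relate the rowsets in memory, even though they arrived as separate queries.
        DataRelation relation = data.Relations.Add("CustomerOrders",
            data.Tables["Customers"].Columns["CustomerID"],
            data.Tables["Orders"].Columns["CustomerID"]);

        // Navigate parent-to-child rows; locating a row is as simple as indexing.
        DataRow firstCustomer = data.Tables["Customers"].Rows[0];
        foreach (DataRow order in firstCustomer.GetChildRows(relation))
        {
            Console.WriteLine("Order {0} on {1}", order["OrderID"], order["OrderDate"]);
        }
    }
}
```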

Data binding has also changed. Instead of working through the red tape of Visual Basic 6.0 data binding, you can easily bind to a rowset in code or through drag-and-drop generated code and designer-created data-binding links. The improvements to the data binding paradigm in ADO.NET 2.0 make the process simpler still.

Addressing an ADOc Recordset is expensive because every column is returned as a Variant. ADO.NET can take advantage of strongly typed DataTable and DataSet objects, so object addressing is a piece of cake. This means data is stored and managed in its native type: an integer is stored as an integer and a string as a string. You may also notice that strongly typed operations execute faster. I also encourage you to use "Option Strict On" and "Option Explicit On" in your development environment. Although this means your code must explicitly coerce variables (converting types in code), the resulting code will be more stable, so that unexpected data does not make it fail the way it so often did.

To make managing table updates easier, the ADO.NET DataAdapter mimics the Visual Basic 6.0 Data Object Wizard (DOW). This class lets you define your own UPDATE, INSERT, and DELETE SQL, either ad hoc queries or stored procedures. This makes ADO.NET leaner, but your code takes on the responsibility of generating these commands; ADO.NET no longer tries to generate them at run time as ADOc did. As we'll see later, ADO.NET 2.0 reintroduces batch updates and asynchronous operations to improve performance.
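
As a sketch of that responsibility, the fragment below supplies its own UpdateCommand to a SqlDataAdapter rather than relying on a CommandBuilder; the table and column names are assumptions for illustration.

```csharp
using System.Data;
using System.Data.SqlClient;

class AdapterDemo
{
    static void Run(SqlConnection connection, DataTable customers)
    {
        SqlDataAdapter adapter = new SqlDataAdapter(
            "SELECT CustomerID, CompanyName FROM Customers", connection);

        // Hand-written action command; ADO.NET does not build this at run time for you.
        SqlCommand update = new SqlCommand(
            "UPDATE Customers SET CompanyName = @CompanyName WHERE CustomerID = @CustomerID",
            connection);
        update.Parameters.Add("@CompanyName", SqlDbType.NVarChar, 40, "CompanyName");
        update.Parameters.Add("@CustomerID", SqlDbType.NChar, 5, "CustomerID");
        adapter.UpdateCommand = update;

        // Push any changed rows in the DataTable back to the server.
        adapter.Update(customers);
    }
}
```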

As with ADOc, exception handling is an important part of an ADO.NET design. Fortunately, the .NET Framework supports Try/Catch exception management, which is more powerful and flexible than the traditional Visual Basic 6.0 "On Error" routines you are familiar with. This style of exception management lets you filter specific data-centric exceptions out from exceptions caused by other problems. That makes writing exception handlers simpler and makes it less likely that the application will fail unexpectedly.
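
A minimal Try/Catch sketch of that filtering idea: SqlException catches the data-centric failures, while everything else falls through to a general handler. The structure is illustrative rather than taken from the article.

```csharp
using System;
using System.Data.SqlClient;

class ExceptionDemo
{
    static void Run(SqlConnection connection)
    {
        try
        {
            connection.Open();
            // ... prepare and execute commands here ...
        }
        catch (SqlException ex)
        {
            // Data-centric problems (bad login, missing server, constraint violations)
            // arrive as SqlException and can be inspected error by error.
            foreach (SqlError error in ex.Errors)
                Console.WriteLine("SQL Server error {0}: {1}", error.Number, error.Message);
        }
        catch (Exception ex)
        {
            // Anything else falls through to a general handler.
            Console.WriteLine("Unexpected failure: {0}", ex.Message);
        }
        finally
        {
            connection.Close();
        }
    }
}
```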

What's New in ADO.NET 2.0?

The latest version, ADO.NET 2.0, fills some of the gaps that made migrating from ADOc harder and adds functionality that was previously unheard of. These innovations let you build safer, more robust, faster code, especially for Windows Forms (client/server) applications. The upgrades include:

No MDAC dependency. Earlier versions of ADO.NET typically required upgrading the MDAC stack (which includes the DLLs needed to run ADOc applications); ADO.NET 2.0 ends this dependency when you use SqlClient. Earlier MDAC versions (2.6 through 2.9) are sufficient for the OLE DB and ODBC .NET data providers. This means installing a .NET application is less disruptive, because it does not affect existing ADOc applications.

Asynchronous operations. Although ADO.NET 2.0 cannot open a connection asynchronously the way ADOc can, it can execute DataReader commands asynchronously. This means you can write code that shows the user a progress bar while the query executes, or performs other work on the application thread, and that reports status as the work progresses.
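
For example, the ADO.NET 2.0 BeginExecuteReader/EndExecuteReader pair supports this pattern. The sketch below simply polls for completion (a real application would update a progress bar or use a callback); the Asynchronous Processing=true connection string keyword is required, and the query and table are placeholders.

```csharp
using System;
using System.Data.SqlClient;

class AsyncDemo
{
    static void Run()
    {
        // Asynchronous commands require this keyword in the connection string.
        string connectionString =
            "Data Source=(local);Initial Catalog=Northwind;Integrated Security=SSPI;" +
            "Asynchronous Processing=true";

        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            connection.Open();
            SqlCommand command = new SqlCommand(
                "SELECT CustomerID, CompanyName FROM Customers", connection);

            IAsyncResult result = command.BeginExecuteReader();

            while (!result.IsCompleted)
            {
                // Update a progress bar or do other work on this thread here.
                System.Threading.Thread.Sleep(100);
            }

            using (SqlDataReader reader = command.EndExecuteReader(result))
            {
                while (reader.Read())
                    Console.WriteLine(reader.GetString(1));
            }
        }
    }
}
```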

Multiple Active Result Sets (MARS). Before ADO.NET 2.0, we could not perform multiple operations over one connection; MARS supports multiple operations on the same connection. This means you do not have to open several connections just to read and update rows at the same time. Of course, this assumes the .NET data provider and the target DBMS support the feature.
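
A sketch of what MARS enables, assuming SQL Server 2005 and a hypothetical Customers table: the MultipleActiveResultSets=True keyword lets an UPDATE run while a DataReader is still open on the same connection.

```csharp
using System.Data.SqlClient;

class MarsDemo
{
    static void Run()
    {
        // MultipleActiveResultSets=True enables MARS against SQL Server 2005.
        string connectionString =
            "Data Source=(local);Initial Catalog=Northwind;Integrated Security=SSPI;" +
            "MultipleActiveResultSets=True";

        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            connection.Open();
            SqlCommand select = new SqlCommand(
                "SELECT CustomerID FROM Customers", connection);
            SqlCommand update = new SqlCommand(
                "UPDATE Customers SET ContactName = @Name WHERE CustomerID = @ID", connection);
            update.Parameters.Add("@Name", System.Data.SqlDbType.NVarChar, 30);
            update.Parameters.Add("@ID", System.Data.SqlDbType.NChar, 5);

            using (SqlDataReader reader = select.ExecuteReader())
            {
                while (reader.Read())
                {
                    // With MARS the update can execute while the reader is still open
                    // on the same connection.
                    update.Parameters["@Name"].Value = "New Contact";
                    update.Parameters["@ID"].Value = reader.GetString(0);
                    update.ExecuteNonQuery();
                }
            }
        }
    }
}
```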

Batch processing. ADOc supported sending several action commands to the server in a single round trip. This capability returns in ADO.NET 2.0. By batching the DataAdapter Update operation, applications post changes faster and make fewer (and cheaper) round trips to the server.
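
A sketch of batching with the SqlDataAdapter: UpdateBatchSize controls how many rows travel per round trip, and the action command's UpdatedRowSource must be set to None for batching to work. The update command itself is assumed to be supplied elsewhere (for example, as in the earlier AdapterDemo sketch).

```csharp
using System.Data;
using System.Data.SqlClient;

class BatchDemo
{
    static void Run(SqlConnection connection, DataTable customers, SqlCommand updateCommand)
    {
        SqlDataAdapter adapter = new SqlDataAdapter();
        adapter.UpdateCommand = updateCommand;

        // Batching requires UpdatedRowSource = None on the action command.
        adapter.UpdateCommand.UpdatedRowSource = UpdateRowSource.None;

        // Send changed rows to the server in batches of 50 per round trip
        // (0 would mean "as large a batch as possible").
        adapter.UpdateBatchSize = 50;

        adapter.Update(customers);
    }
}
```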

BCP support. When you move data in from an external or generated data source, the bulk copy (BCP) facility matters: it is far faster than the traditional ADO.NET approach. ADO.NET 2.0 now includes a bulk copy class that gives direct access to this high-speed data upload capability. It can complete a bulk data transfer in a fraction of the time the traditional ADO methods require.
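
The class that exposes this in SqlClient is SqlBulkCopy; a minimal sketch, with a hypothetical staging table as the destination and an already-open connection assumed, might look like this.

```csharp
using System.Data;
using System.Data.SqlClient;

class BulkCopyDemo
{
    static void Run(SqlConnection connection, DataTable sourceRows)
    {
        // SqlBulkCopy streams rows to the server through the bulk-load interface,
        // far faster than row-by-row INSERTs through a DataAdapter.
        using (SqlBulkCopy bulkCopy = new SqlBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = "dbo.CustomersStaging"; // hypothetical table
            bulkCopy.BatchSize = 1000;
            bulkCopy.WriteToServer(sourceRows);
        }
    }
}
```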

Exposing a DataSet as a DataReader. To make it easier to pass data between tiers, you can now create a DataSet (or DataTable) and hand it to another tier as a DataReader. You can also load a DataTable directly from a DataReader.
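
Both directions are shown in the sketch below: CreateDataReader exposes a filled DataSet as a DataTableReader, and DataTable.Load fills a table straight from a live SqlDataReader. The query and table names are placeholders.

```csharp
using System.Data;
using System.Data.SqlClient;

class ReaderBridgeDemo
{
    static void Run(SqlConnection connection, DataSet data)
    {
        // Expose a filled DataSet to another tier as a forward-only reader.
        DataTableReader tableReader = data.CreateDataReader();
        while (tableReader.Read())
        {
            // Consume rows exactly as you would from a SqlDataReader.
        }

        // Going the other way: load a DataTable straight from a live DataReader.
        SqlCommand command = new SqlCommand("SELECT * FROM Customers", connection);
        DataTable customers = new DataTable("Customers");
        using (SqlDataReader reader = command.ExecuteReader())
        {
            customers.Load(reader);
        }
    }
}
```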

DataSet serialization. In earlier versions you could use Remoting to pass a DataSet between tiers, but the data traveled as XML. In ADO.NET 2.0 the DataSet can be serialized in a binary format. This dramatically improves performance between tiers, although the data travels in Microsoft's proprietary format. Another option (SchemaSerializationMode, for typed DataSets) removes the schema from the data stream, reducing the amount of data transmitted while remaining usable in cross-platform scenarios.
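
A sketch of binary serialization using the RemotingFormat property together with the standard BinaryFormatter; this illustrates the mechanism rather than a complete remoting setup.

```csharp
using System.Data;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

class SerializationDemo
{
    static byte[] Serialize(DataSet data)
    {
        // New in ADO.NET 2.0: ask the DataSet to serialize itself in binary form
        // instead of the (much larger) XML representation.
        data.RemotingFormat = SerializationFormat.Binary;

        BinaryFormatter formatter = new BinaryFormatter();
        using (MemoryStream stream = new MemoryStream())
        {
            formatter.Serialize(stream, data);
            return stream.ToArray();
        }
    }
}
```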

Yukon support. For the first time, ADO.NET 2.0 supports the native SQL Server 2005 data types, and it can also work with CLR-based user-defined types. These data types include varchar(max), nvarchar(max), varbinary(max), and the new XML type.

New common base classes. To make it easier to write generic applications, ADO.NET 2.0 exposes a set of Db* base classes, for example DbConnection, DbCommand, and DbDataAdapter. These classes still require provider-specific SQL syntax, but they can be used to write applications that access several different data sources.
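
A sketch of provider-agnostic code built on these base classes via DbProviderFactories; the provider name, connection string, and query are parameters the caller would supply.

```csharp
using System.Data;
using System.Data.Common;

class FactoryDemo
{
    static DataTable Run(string providerName, string connectionString, string query)
    {
        // Resolve a provider by name, e.g. "System.Data.SqlClient" or "System.Data.OleDb".
        DbProviderFactory factory = DbProviderFactories.GetFactory(providerName);

        using (DbConnection connection = factory.CreateConnection())
        {
            connection.ConnectionString = connectionString;
            connection.Open();

            // The same code works for any provider; only the SQL dialect
            // and connection string remain provider-specific.
            DbCommand command = connection.CreateCommand();
            command.CommandText = query;

            DataTable result = new DataTable();
            result.Load(command.ExecuteReader());
            return result;
        }
    }
}
```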

Server and provider enumeration. These new classes let you discover the .NET data providers that are installed, as well as the servers and instances visible to the application. They are useful when writing administrative tools, starting/stopping/pausing servers, or simply determining server state. Note that, in addition to SQL-DMO, SQL Server now exposes SMO for managing instances of SQL Server.
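
For instance, SqlClient's SqlDataSourceEnumerator can list the SQL Server instances visible on the network, as sketched below; ServerName and InstanceName are columns of the DataTable it returns.

```csharp
using System;
using System.Data;
using System.Data.Sql;

class EnumerationDemo
{
    static void Run()
    {
        // Ask SqlClient for the SQL Server instances visible on the network.
        DataTable servers = SqlDataSourceEnumerator.Instance.GetDataSources();
        foreach (DataRow row in servers.Rows)
        {
            Console.WriteLine("{0}\\{1}", row["ServerName"], row["InstanceName"]);
        }
    }
}
```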

New counters. The SqlClient provider exposed a number of performance counters in earlier versions of .NET, but not all of them were reliable. The new counters are more accurate and can report the state of the connection pooling mechanism.

Connection pooling management. ADO.NET 2.0 not only lets you clear one pool, or all of them, it also adds settings that govern pool behavior, whereas earlier versions only let you select connection pooling options. The improved pooling mechanism also helps you detect bad pools, that is, pools whose connections point to a server that is no longer available.
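
The pool-clearing part of this is exposed as static methods on SqlConnection, as in this small sketch.

```csharp
using System.Data.SqlClient;

class PoolDemo
{
    static void Reset(SqlConnection connection)
    {
        // Discard the pool associated with this connection's connection string...
        SqlConnection.ClearPool(connection);

        // ...or flush every SqlClient pool in the application domain.
        SqlConnection.ClearAllPools();
    }
}
```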

Whidbey Integration

ADO has been integrated into Visual Basic and Visual Studio for some time. This time, however, the Visual Studio 2005 team has taken the integration deeper. Although you won't see wizards that create the familiar DataAdapter, you will see drag-and-drop techniques that generate many familiar (and new) strongly typed data structures. Keep in mind that these tools are most interesting when you want a RAD interface to generate code for you, especially in simple cases. Microsoft says it has also done a lot of work to support n-tier development. You can create an application that builds a DataSet within the same assembly, or one that accesses a DataSet and its objects in a referenced assembly. As the design grows more complex, these tools can expose your own generated middle-tier business objects. Discussion of this integration is beyond the scope of this article, but it is covered in my book.

What is the alternative?

For any software solution there are a dozen alternatives; once a solution is finished, someone can always point out a dozen other ways it could have been done. Each of these approaches has its own pros and cons. In this article we will focus on the data access interface and leave the rest of the conversion work to others. Of course, your data access code may not be that easy to isolate. Although most industry pundits (and I) have been preaching 3-tier (or "n"-tier) designs for decades, not everyone has taken the advice. If you built your application with a set of independent middle-tier objects that handle the data, it is easy to convert those objects to ADO.NET. However, if your code is fragmented, like the "spaghetti" code I've been reviewing for years, extracting the ADOc routines from the application's structure without the whole thing falling apart may feel like trying to pull the loose hair out of a Russian wolfhound in early spring. Let's look at the alternatives; one of them is bound to suit you and your skills.

Convert the project to an ADO.NET project using the Visual Basic 6.0 project Conversion Wizard. The new Visual Studio 2005 Conversion Wizard is said to be smarter than the earlier wizards, but it still cannot convert ADOc code to ADO.NET. While larger and more complex projects may take some time, the conversion should be smooth sailing for simpler projects. After conversion, you get ADOc code wrapped in the COM interop layer, so it can be compiled in a new Visual Basic .NET project and deployed and run on the target system along with the registered MDAC version. It also means you need to watch certain coding techniques (some late-binding techniques do not convert well), which I discussed in the article I wrote this time last year.

Another possible approach is to visit each section of ADOc code and break it down logically. Consider whether the task could be completed with a disconnected DataSet, or whether it simply returns rows through a DataReader, which is similar to the default "firehose" Recordset. In those cases, switching to ADO.NET is simple and clean. Where you use server-side cursors (especially common in client/server applications), you need to decide whether to redesign the code around a disconnected DataSet strategy, or to manage server-side cursors yourself through the ANSI CURSOR operators and classes you create. In many cases a few design changes eliminate the need for server-side cursors altogether. However, that may not be realistic, so think it through carefully.

The third approach is particularly useful if your ADOc code is isolated in middle-tier components. In this case, you "simply" create an ADO.NET version of the component that returns a similar structure to replace the ADOc code, assuming the component's consumers do not pass the ADOc Recordset on to another tier. If you expose a custom business object class, it is easy to wrap it into an ADO.NET data access component with the tools in Visual Studio 2005.

Main considerations

Before embarking on a drastic conversion, you need to consider where you are starting from and ask yourself: "Can I overcome all the difficulties involved in this conversion?" Consider the following factors that affect the conversion process:

What kind of architecture are you migrating from? If you are migrating from an ASP system, the ASP.NET approach is very powerful, but the way you manage ADO objects is very different. You will love the new model, even though it has not supported bidirectional data binding until now; that is acceptable.

What are your skills? Which languages are you good at? If you have written a lot of Visual Basic 6.0 code and, like many people, have been using Visual Basic .NET for a while, it may be easier for you to migrate the code "logically": converting the operations rather than the code itself. As long as you do not use server-side cursors, you can reuse most of your connection strings and SQL. If your skills are not yet at that level, it may be easier to start with the Visual Studio 2005 migration wizard, because most simple cases are not hard to convert.

To what extent does your code depend on server-side functionality such as keyset or dynamic cursors, or routines that manage server-side state (such as #temp tables)? In these cases the conversion will be more difficult, because even the latest ADO.NET 2.0 does not directly support these features. However, there is an alternative: using the ANSI CURSOR commands. (See my article on using the ANSI CURSOR statements.)

Are you migrating from ADO classic (ADOc 1.0 through 2.6), DAO, ODBC, or some other proprietary data provider? Migrating from a recent version of ADO is much easier, because many of the concepts are similar: not identical, but similar. If you use the Visual Basic 6.0 migration wizard, most ADOc code can be carried over unchanged, apart from the issues I mentioned in earlier articles.

Migrating from DAO, the ODBC API, or a proprietary interface is another matter, however. The automated wizards may not be able to convert your code at all, even if you try. If so, I recommend that you also re-evaluate the DBMS engine. If you are using DAO against JET, I usually recommend that customers migrate from JET to SQL Express, because it is stronger, more stable, and more scalable. Microsoft is moving away from JET as quickly as possible, and I recommend that customers and students do the same.

Which bound controls are you using? Although some simple controls (such as TextBox, ListBox, and ComboBox) have a fairly simple conversion path, the DataGrid does not. The binding mechanism in Visual Basic .NET is fundamentally improved and significantly different, so you may find inconsistencies when migrating complex bound controls. Many properties are different in the .NET equivalent controls, and some methods do not exist. Third-party controls are usually available in .NET versions; you might want to investigate them before venturing in. As for custom controls, Visual Studio 2005 converts them better than before, but do not expect too much; I suspect the conversion wizard's new functionality still has plenty of problems.

Are you ready to write your own SQL action command logic? Don't forget that ADO classic builds the action SQL for you at run time based on the SELECT statement. ADO.NET, in contrast, encourages you to write these action commands yourself. As mentioned earlier, you can use the CommandBuilder, but you will quickly run into its limitations. I think you will find it is not as flexible as the ADO classic Update method, especially considering that the Update Criteria property let you change the kind of action commands that were created.

Summary

Yes, ADO "classic" code can become ADO.NET. Is the process as difficult as it looks? I doubt it. Once you understand that ADO.NET is a new (and better) data access interface rather than a clone of ADOc, you will be more comfortable with the conversion. Not all applications need to be converted; as I said before, there is no need to fix parts that are not broken. I hope this article helps. If you need more, the books below should come in handy.

Related books

ADO.NET and ADO Examples and Best Practices for VB Programmers

ADO.NET Examples and Best Practices for C# Programmers

1. transmogrify (trăns-mŏg′rə-fī′, trănz-), transitive verb. To change into a different shape or form, especially a bizarre one.
(The American Heritage Dictionary of the English Language, Third Edition)


