
Implementing SQL Server in a DotNet Environment: .NET Connection Pooling and Connection Management

by Eric Charran

Introduction

Many business organizations are developing next generation business solutions which require a large degree of richness and functionality at a decreased total cost of ownership. These business solutions must be constructed in a time-effective manner and often integrate across platforms. Technical teams are finding themselves forced to choose a stable and robust platform to develop these solutions on, while meeting the requirements and constraints of budgets and business needs.

What is .NET?

Microsoft’s .NET platform, nearly two years old (counting the public beta), is a next-generation programming and application framework that allows businesses to leverage the power of Rapid Application Development (RAD) to produce and implement stable, rich application environments that solve business problems. The framework incorporates multiple languages and technologies and steers solution development towards functional building-block applications that interconnect to form a comprehensive application or solution. This is a significant paradigm shift from building monolithic applications, which were in effect information silos: the capabilities and elements they contained were inaccessible to other applications, so developers often had to rewrite the same code in multiple environments and languages.

Nothing New Under the Sun

While the overall multi-tier concept for application development (data tier, business logic tier and presentation tier) has not changed, the methods used to implement these concepts have. One common element across most software frameworks is the interaction of the business code with the data tier.

Microsoft has taken a significant departure from the traditional methods by which developers accessed data in the data tier of multi-tier applications. By supplanting ActiveX Data Objects (ADO) with ADO.NET in the .NET Framework, the persistence, performance and management of data have taken a quantum leap.

At the center of this revolution is SQL Server. While the overall architecture of SQL Server has not changed, database administrators and developers must have a good understanding of the methods by which developers will be accessing SQL Server in order to maintain optimal application performance from the data tier. Additionally, data tier architects and database administrators on the SQL Server side should be in a position to make recommendations regarding data access methods using ADO.NET.

Breaking out of the Database —  Beyond T-SQL

With the release of Yukon (the code name for the next version of SQL Server), Microsoft is significantly altering the architecture of SQL Server. The Common Language Runtime (CLR), which supports the .NET Framework by providing runtime compilation and code-execution capabilities to .NET languages, will become part of SQL Server. This means that developers will be able to write stored procedures in .NET languages instead of being bound to T-SQL. While the benefits of these changes speak for themselves, database developers and data tier architects must make a significant leap beyond the traditional role of a database administrator, and beyond the database itself, to understand these languages and programming techniques. Soon, it will not be uncommon for SQL Server database administrators and data tier architects to write, debug, and optimize stored procedures written in C#, VB.NET and T-SQL. By understanding the concepts surrounding ADO.NET, SQL Server experts will equip themselves for the upcoming Yukon release.

Constructing an Application Data Access Strategy

One of the earliest elements of application design and coding is to construct a data access strategy. The data access strategy will govern how all elements of the application leverage and use the database server. This code is often abstracted into a class or series of classes which can be accessed from all elements of the application. As a result, the data access strategy and its accompanying code must be the most efficient and streamlined block in the application. In any application, one of the major bottlenecks will be the access between tiers to the database.

Microsoft has provided several resources for ensuring that developers understand and utilize best-practice methods for accessing data. Resources such as the Microsoft Application Blocks (precompiled DLLs that provide a sealed database access class) and optimized data providers such as SqlClient allow for quick coding of data access in an application. Despite the existence of these resources, a well-planned data access strategy must still be devised and implemented.

Using the SqlClient

A primary consideration in the foundation of an application’s data access strategy is the communication and transport methodology used to get data to and from the database. ADO.NET and the latest Microsoft Data Access Components (MDAC) provide a wealth of capabilities for connecting to multiple data sources. An effective data access strategy will select the best and most efficient set of protocols and classes for the job.

.NET applications leveraging SQL Server as a data tier will undoubtedly have an advantage over a heterogeneous application architecture. .NET contains a native ADO.NET provider for SQL Server (as well as one for Oracle) that allows developers to leverage the power of the database through a series of classes tuned to SQL Server’s capabilities. Microsoft recommends using this provider when developers write applications against SQL Server, as opposed to the OLE DB provider. This is a shift in recommendations from ADO, which used OLE DB. The SqlClient classes will provide a measurable increase in communication performance between application tiers.

By selecting SqlClient as the provider when developing against a SQL Server data tier, the foundation for a sound data access strategy will have been established.

Stored Procedures

Once data access protocols have been selected, application architects will have to determine the most effective method to manipulate data in the database. .NET provides at least two approaches that application architects should examine when determining a data access strategy across an application. Stored procedures provide an encapsulated and manageable approach to interacting with data in a database from an application. Alternatively, the .NET CommandBuilder object can be used to construct SQL statements that interact with the database.

Using stored procedures to effect database changes and retrieve data from SQL Server is the best practice method of facilitating communications and transactions from the business layer and the database layer. Stored procedures exist as precompiled code within SQL Server that allow for complex logic and database operations. The consideration of stored procedures when formulating a data access strategy is paramount.

While .NET languages provide methods of building commands using objects like the CommandBuilder (an ADO.NET object that automatically constructs INSERT, UPDATE and DELETE SQL statements based on a SELECT statement), stored procedures have additional benefits, as outlined below.
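As a sketch of the CommandBuilder approach (the server, database and table names here are illustrative, and a reachable SQL Server is assumed), a single SqlCommandBuilder can derive the action commands from the adapter's SELECT statement:

```vbnet
Imports System.Data
Imports System.Data.SqlClient

' Server, database and table names are illustrative.
Dim cString As String = "Integrated Security=SSPI;database=Northwind;server=mySQLServer"

' The SELECT must include the primary key so the CommandBuilder can
' generate matching INSERT, UPDATE and DELETE commands.
Dim adpCustomers As New SqlDataAdapter("SELECT CustomerID, Name FROM Customers", cString)
Dim bldCustomers As New SqlCommandBuilder(adpCustomers)

Dim ds As New DataSet()
adpCustomers.Fill(ds, "Customers")

' Change a value; Update() sends the generated UPDATE statement.
ds.Tables("Customers").Rows(0)("Name") = "New Name"
adpCustomers.Update(ds, "Customers")
```

This convenience comes at the cost of the dynamic-SQL drawbacks discussed next, which is why stored procedures remain the preferred approach.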

Stored Procedures vs. Dynamic SQL

Many developers have placed SQL code directly in the business tier (i.e., hard-coding SQL statements as strings and using ADO or ADO.NET to send the string as a SQL statement to the database). This practice is strongly discouraged. One reason to avoid it, from an application performance perspective, is that any such statement submitted to SQL Server will be interpreted as a dynamic SQL statement.

In essence, SQL statements built with strings and sent to the database server are equivalent to typing the SQL statement into SQL Server’s Query Analyzer. Because these statements are not compiled code, several steps must occur before each one is executed: syntax checking, database object verification, execution plan generation and, finally, statement execution. In comparison, stored procedures execute as precompiled code, skipping most of the aforementioned steps.

Procedure Cache Crowding

If dynamic SQL is executed often enough, it will become part of the procedure cache. However, multiple string-built SQL statements submitted from the data tier can result in procedure cache crowding. Essentially, the procedure cache becomes filled with variations of the same SQL statement, or overwhelmed by a number of differing SQL strings. While the implications may not be apparent immediately, short- and long-term database execution performance degradation will occur, affecting the whole application.

A defined set of stored procedures will make the best candidates for using the procedure cache. SQL Server keeps advanced statistics on stored procedure execution plans and their usage. This ensures that when popular stored procedures are called by code, they will execute with the benefits of precompiled code. Stored procedures may fall out of the procedure cache if the application runs an excessive number of statements and stored procedures. However, the majority of stored procedures called frequently by the application will be preserved there.

Maintenance and Security

From a maintenance perspective, stored procedures allow for the consolidation of code. A basic principle of .NET and software design is for developers to write code once and reuse it many times. Stored procedures represent a single point for authoring and maintaining specific portions of an application’s database activities. If SQL strings are built in the code, additional time and effort will be spent maintaining those strings in multiple places whenever database object names change. While it may be possible to consolidate these elements into a global file or class, doing so is not a recommended best practice.

Additionally, from a security perspective, ASP.NET applications as well as Windows Forms applications may suffer from SQL injection attacks. These attacks are commonly perpetrated by malicious users who append harmful SQL to the end of an inline SQL statement. The following example outlines a SQL injection attack.

Case 1: expected input

  Control value:
    Me.txtName.Text = "Eric"

  VB code:
    SqlStatement = "SELECT CustomerID FROM Customers WHERE Name = '" & Me.txtName.Text & "'"

  SQL sent to SQL Server:
    SELECT CustomerID FROM Customers WHERE Name = 'Eric'

Case 2: malicious input

  Control value:
    Me.txtName.Text = "Eric'; DELETE FROM master.dbo.sysobjects --"

  VB code (unchanged):
    SqlStatement = "SELECT CustomerID FROM Customers WHERE Name = '" & Me.txtName.Text & "'"

  SQL sent to SQL Server:
    SELECT CustomerID FROM Customers WHERE Name = 'Eric'; DELETE FROM master.dbo.sysobjects --'

Note that the attacker closes the string literal and comments out the trailing quote, turning one query into two.

 Figure 1: SQL injection example.

Stored procedures and their parameters provide a natural defense against the type of attack depicted above. While nothing is foolproof, stored procedure parameters represent an optimal method of passing meaningful elements of data into specified transactions within the data tier. Because the code within the stored procedure is localized to one database object, security can be granted on the stored procedure itself and maintenance is eased.
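As a sketch (the stored procedure name, parameter and connection string are illustrative), the same customer lookup performed through a parameterized stored procedure call would pass the input as data rather than splicing it into SQL text:

```vbnet
Imports System.Data
Imports System.Data.SqlClient

' Connection string and procedure name are illustrative. Assumes a
' stored procedure such as:
'   CREATE PROCEDURE GetCustomerByName @Name nvarchar(50)
'   AS SELECT CustomerID FROM Customers WHERE Name = @Name
Dim cString As String = "Integrated Security=SSPI;database=Northwind;server=mySQLServer"
Dim userInput As String = "Eric'; DELETE FROM master.dbo.sysobjects --"

Dim cn As New SqlConnection(cString)
Dim cmd As New SqlCommand("GetCustomerByName", cn)
cmd.CommandType = CommandType.StoredProcedure

' Bound as a typed parameter, the malicious string is treated as a
' literal customer name, never as executable SQL.
cmd.Parameters.Add("@Name", SqlDbType.NVarChar, 50).Value = userInput

cn.Open()
Try
    Dim customerId As Object = cmd.ExecuteScalar()
Finally
    cn.Close()
End Try
```

The DELETE never runs; the entire input is compared against the Name column and simply matches no rows.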

Data Access Business Objects

Continuing with the theme of write once, use many times, developers should make use of blocks of code that perform specific, generalized functions enabling database access. In the past, Visual Basic 6.0 developers built classes and objects that accepted commands from business-level code, performed the requested transactions in the database, and returned ADO recordsets. This concept has not changed with the advent of .NET. It is still good practice, when developing the data access strategy for an application, to have a generic data tier management class that handles all transactions and fetches against the database.

The goal of the data access block should be to hide from upper-level code the inner workings of obtaining, assigning and managing SqlConnections, returning DataSets and DataReaders, or performing non-query (INSERT, UPDATE or DELETE) actions in the database. The data access layer should also observe best practices in connection handling, in order to facilitate proper usage of the connection pool, so developers don’t have to worry about opening and closing connections in upper levels of the application.
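For example, a minimal helper in this style (class, method and connection-string details are illustrative) opens a connection as late as possible and closes it as early as possible, so connections return promptly to the pool; SqlClient pools automatically as long as the connection string is identical:

```vbnet
Imports System.Data
Imports System.Data.SqlClient

Public Class DataTier
    ' One connection string for the whole application keeps all
    ' connections in a single pool (pooling is on by default).
    Private Shared ReadOnly cString As String = _
        "Integrated Security=SSPI;database=Northwind;server=mySQLServer"

    ' Open late, close early: the Finally block guarantees the
    ' connection goes back to the pool even if the command fails.
    Public Shared Function ExecuteNonQuery(ByVal spName As String) As Integer
        Dim cn As New SqlConnection(cString)
        Dim cmd As New SqlCommand(spName, cn)
        cmd.CommandType = CommandType.StoredProcedure
        Try
            cn.Open()
            Return cmd.ExecuteNonQuery()
        Finally
            cn.Close()
        End Try
    End Function
End Class
```

Upper-level code calls DataTier.ExecuteNonQuery("SomeProcedure") and never touches a connection directly.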

Data Access Block for .NET

The Data Access Block for .NET is one in a series of Application Blocks provided by Microsoft. These Application Blocks are freely available for download from MSDN for use within application development. The goal of the Application Blocks is to reduce the time it takes for developers to design and write blocks of code that handle generic functions. The Data Access Block provides a completely enclosed series of methods and properties that allow developers to drop the block into their Visual Studio .NET projects and instantly have a preconfigured data access layer.



Figure 2: Diagram of the SQLHelper class project reference.

By using this application block, developers can avoid writing a custom data access module and instead leverage a robust class that is connection aware and embodies database connectivity best practices. In essence, developers can go from code like the following:

Imports System
Imports System.Data
Imports System.Data.SqlClient
Imports System.Drawing
Imports System.ComponentModel
Imports System.Windows.Forms

Dim cString As String = "Persist Security Info=False;" & _
    "Integrated Security=SSPI;database=northwind;server=mySQLServer"

Dim cnNorthwind As SqlConnection = new SqlConnection(cString)

' Create a SqlDataAdapter for the Suppliers table.
Dim adpSuppliers As SqlDataAdapter = new SqlDataAdapter()

' A table mapping tells the adapter what to call the table.
adpSuppliers.TableMappings.Add("Table", "Suppliers")

cnNorthwind.Open()

Dim cmdSuppliers As SqlCommand = _
new SqlCommand("SELECT * FROM Suppliers", cnNorthwind)

cmdSuppliers.CommandType = CommandType.Text

adpSuppliers.SelectCommand = cmdSuppliers
Console.WriteLine("The connection is open.")

Dim ds As New DataSet("Suppliers")

adpSuppliers.Fill(ds)

To the following, using SqlHelper:

Dim ds As DataSet = SqlHelper.ExecuteDataset(cString, CommandType.StoredProcedure, SPSQL)

Whether you roll your own data access block or use the downloadable one from Microsoft, the data access object remains a best practice in an application’s data access strategy. It hides the mundane workings of database access from upper-level code and ensures a stable transactional and data access component across the entire application.

--

Eric Charran is currently working as a Technical Architect and Web Developer in Horsham, PA. Eric holds Microsoft certifications in SQL Server and has significant experience in planning, designing, and modeling n-tier applications using SQL Server 2000, as well as architecting and implementing .NET Framework-based applications, solutions and web services. Eric's other professional skills encompass database administration, data warehousing and application architecture, modeling and design, as well as data warehouse transformation and population using DTS. Eric is also skilled in developing and implementing ASP.NET and Windows Forms applications, using technologies such as COM+, the .NET Framework, VB.NET and other .NET languages.


Contributors : Eric Charran
Last modified 2005-04-12 06:21 AM