Ado.Net: Best Practices for C# Programmers


    Hi Friends,

    All of us use ADO.Net in one way or another when writing database connectivity code. In .Net, whether you use your own custom data-access classes, an ORM such as Entity Framework, or the Enterprise Library, ADO.Net is doing the work underneath. So naturally there are some good practices, dos and don'ts, worth knowing. Let's get to them.

    ADO.Net provides two types of objects for retrieving relational data and storing it in memory:

    1- DataSet

    It provides a complete, in-memory set of data that includes the tables, their ordering, and the relationships among them. Where can we use DataSets or, in the terms of this topic, what are the best scenarios for using them?

    • If you are going to navigate between multiple discrete sets of tables.
    • If data is coming from multiple different sources, e.g. a database, an XML file and a spreadsheet.
    • Using an XML web service, as DataSets can be passed to a remote client.
    • Reusing the same set of rows for different operations like searching, sorting and filtering.
    • Performing a large amount of processing per row. Doing this with a DataReader hurts performance, as it ties up the connection for that whole period.
    • Manipulating data using XML operations like XSLT transformations or XPath queries.


    2- DataReader

    It provides a fast, forward-only, read-only stream of data. We can use it when:

    • We do not need to cache the data.
    • We are processing data that is too large to keep in memory.
    • We need to access data quickly, once, in a read-only, forward-only mode.
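    As a minimal sketch of this forward-only pattern (the connection string, table and column names here are placeholders, not from the article):

    ```csharp
    using System;
    using System.Data.SqlClient;

    class ReaderDemo
    {
        static void Main()
        {
            // Placeholder connection string -- substitute your own server/database.
            var connString = "Server=.;Database=Shop;Integrated Security=true;";

            using (var conn = new SqlConnection(connString))
            using (var cmd = new SqlCommand("SELECT Id, Name FROM Products", conn))
            {
                conn.Open();
                // The reader streams rows forward-only; the connection stays busy
                // until the reader is closed, so keep per-row work light.
                using (var reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                        Console.WriteLine($"{reader.GetInt32(0)}: {reader.GetString(1)}");
                }
            }
        }
    }
    ```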

    Now that we've seen the scenarios for using DataReader and DataSet, let's move on to the best practices in ADO.Net.

    1- Design your DAL(Data Access Layer) According to the Data Used in Your Application:

    This simply means: fetch only the data the application requires, not the whole bunch every time just for the sake of reusability.

    2- Cache Data to Reuse:

    Caching data can reduce the load on your database server. By caching we avoid frequent connections to the database.
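    One way to sketch this, assuming the MemoryCache type from System.Runtime.Caching and a hypothetical loader delegate standing in for the real database query:

    ```csharp
    using System;
    using System.Data;
    using System.Runtime.Caching;

    static class CountryCache
    {
        // Hypothetical loader -- in practice this would query the database.
        public static Func<DataTable> LoadFromDb;

        public static DataTable GetCountries()
        {
            var cache = MemoryCache.Default;
            var table = cache.Get("countries") as DataTable;
            if (table == null)
            {
                table = LoadFromDb();                        // one database hit
                cache.Set("countries", table,
                          DateTimeOffset.Now.AddMinutes(30)); // reuse for 30 minutes
            }
            return table;
        }
    }
    ```

    Reference data that changes rarely (country lists, lookup tables) is the usual candidate; volatile data should keep hitting the database.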

    3- Acquire Late, Release Early:

    Open database connections right where you need them, and close them as soon as you're finished. Do not open them early, and don't hold them open across calls.
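    A minimal sketch of the pattern with using blocks (the table, column and parameter names are illustrative):

    ```csharp
    using System.Data.SqlClient;

    static class OrderDal
    {
        static int GetOrderCount(string connString, int customerId)
        {
            // The connection is opened just before use and disposed (closed)
            // as soon as the query completes -- acquire late, release early.
            using (var conn = new SqlConnection(connString))
            using (var cmd = new SqlCommand(
                "SELECT COUNT(*) FROM Orders WHERE CustomerId = @id", conn))
            {
                cmd.Parameters.AddWithValue("@id", customerId);
                conn.Open();
                return (int)cmd.ExecuteScalar();
            }   // Dispose returns the connection to the pool here
        }
    }
    ```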

    4- Reduce Round Trips:

    Use batch statements whenever possible. Executing statements one at a time requires unnecessary trips to the server.

    Use connection pooling. Pooling avoids the round trips otherwise spent on connection establishment and authentication.
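    A sketch of both ideas, assuming the SQL Server provider (table names and pool sizes are illustrative; pooling is on by default for SqlClient):

    ```csharp
    using System.Data.SqlClient;

    class BatchDemo
    {
        static void Main()
        {
            // These keywords just make the pool bounds explicit
            // (illustrative values -- pooling is already the default).
            var connString = "Server=.;Database=Shop;Integrated Security=true;" +
                             "Min Pool Size=5;Max Pool Size=100;";

            using (var conn = new SqlConnection(connString))
            {
                conn.Open();
                // Two statements batched into a single round trip.
                var sql = "UPDATE Products SET Stock = Stock - 1 WHERE Id = 7;" +
                          "INSERT INTO AuditLog(ProductId) VALUES (7);";
                using (var cmd = new SqlCommand(sql, conn))
                    cmd.ExecuteNonQuery();
            }
        }
    }
    ```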

    5- Return the Data You Need:

    Always return only the columns and rows you need, and cache the data where possible.
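    For example, naming the columns explicitly and capping the rows server-side (against a hypothetical Products table):

    ```csharp
    using System.Data.SqlClient;

    static class ProductQueries
    {
        static SqlCommand BuildPageCommand(SqlConnection conn)
        {
            // Name the columns instead of SELECT *, and limit rows on the
            // server; only the data the page actually shows crosses the wire.
            return new SqlCommand(
                "SELECT TOP (50) Id, Name, Price FROM Products ORDER BY Name",
                conn);
        }
    }
    ```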

    6- Use Windows Authentication Whenever Possible:

    If you are building an application with security in mind, use Windows Authentication. SQL Server Authentication is faster, but it adds the overhead of protecting the credentials in transit between your application and the database.

    7- Choose the Right Transaction Type:

    Proper transaction management minimizes locking and contention and preserves data integrity. There are three types of transactions:

    • Native database support: It permits us to control the transaction from stored procedures. It is limited to a single call, but the stored procedure can contain complex statements and can call other stored procedures.
    • ADO.Net transactions: These enable us to span a transaction across multiple calls to a single data store.
    • Enterprise Services distributed transactions: These can be used when we need transactions to span multiple data stores.
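    The second option, an ADO.Net transaction, might be sketched like this (the Accounts table and its columns are assumptions for illustration):

    ```csharp
    using System.Data.SqlClient;

    static class TransferDal
    {
        static void TransferFunds(string connString, int from, int to, decimal amount)
        {
            using (var conn = new SqlConnection(connString))
            {
                conn.Open();
                using (var tx = conn.BeginTransaction())   // ADO.Net transaction
                {
                    try
                    {
                        using (var cmd = new SqlCommand(
                            "UPDATE Accounts SET Balance = Balance - @amt WHERE Id = @id",
                            conn, tx))
                        {
                            cmd.Parameters.AddWithValue("@amt", amount);
                            cmd.Parameters.AddWithValue("@id", from);
                            cmd.ExecuteNonQuery();

                            cmd.CommandText =
                                "UPDATE Accounts SET Balance = Balance + @amt WHERE Id = @id";
                            cmd.Parameters["@id"].Value = to;
                            cmd.ExecuteNonQuery();
                        }
                        tx.Commit();    // both updates succeed together...
                    }
                    catch
                    {
                        tx.Rollback();  // ...or neither is applied
                        throw;
                    }
                }
            }
        }
    }
    ```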

    8- Use Stored Procedures:

    Avoid embedded SQL statements. Generally, well-designed stored procedures outperform embedded statements. The following points support that claim:

    • Logic separation: Stored procedures let you clearly separate business logic from data-manipulation logic, for performance, maintainability and flexibility benefits.
    • SQL optimization: Prior to MSSQL 2000, a query execution plan was cached for faster execution of stored procedures but not for dynamic SQL. From MSSQL 2000 onwards, the query execution plan is cached for both stored procedures and dynamic SQL.
    • Tuning and deployment: Stored procedures are stored in the database and can be tuned there by database administrators. The application need not be redeployed when stored procedures are tuned. With embedded statements that is not possible: they are part of the application, and the application has to be redeployed whenever they are tuned.
    • Network traffic: The source code of a stored procedure lives on the server, so only the procedure name and its parameters are sent over the wire. That significantly reduces network overhead compared to embedded statements, where the whole statement is sent to the server for execution.
    • Simplified batching of commands: In a stored procedure you can perform an entire calculation across different tables in one call, where embedded statements would require multiple statements and server calls.
    • Data security: With stored procedures, access levels can be applied, which helps ensure data security.
    • SQL injection: Embedded statements are prone to SQL injection attacks, whereas parameters of stored procedures are forced by the parameter collection to be treated as literal values, so they are never executed as code.
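    As a sketch, calling a hypothetical procedure usp_GetOrderTotal through the parameter collection (the procedure name and its return type are assumptions):

    ```csharp
    using System.Data;
    using System.Data.SqlClient;

    static class OrderTotals
    {
        static decimal GetOrderTotal(string connString, int orderId)
        {
            using (var conn = new SqlConnection(connString))
            // "usp_GetOrderTotal" is a hypothetical procedure name.
            using (var cmd = new SqlCommand("usp_GetOrderTotal", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                // Parameters are sent as typed values, never spliced into
                // SQL text, so user input cannot be executed as code.
                cmd.Parameters.Add("@OrderId", SqlDbType.Int).Value = orderId;
                conn.Open();
                return (decimal)cmd.ExecuteScalar();
            }
        }
    }
    ```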

    9- Handling Exceptions:

    Use a try/finally block (or a using statement) to ensure that connections are closed regardless of whether exceptions are thrown. There are a few gotchas in exception handling that should be considered:

    • Do not just catch and rethrow exceptions if your data-access logic adds no value by doing so, as exceptions are expensive in terms of time. The simpler approach is to let the exception propagate from the database all the way back to the caller.
    • If you want to shield the caller from data-specific details, catch the exception, log the details to a log store, and then return a value (preferably enumerated) from a list of codes.
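    The second approach might look like this sketch (the Products table and the Log sink are hypothetical):

    ```csharp
    using System;
    using System.Data.SqlClient;

    static class StockDal
    {
        static int? TryGetStock(string connString, int productId)
        {
            try
            {
                using (var conn = new SqlConnection(connString))
                using (var cmd = new SqlCommand(
                    "SELECT Stock FROM Products WHERE Id = @id", conn))
                {
                    cmd.Parameters.AddWithValue("@id", productId);
                    conn.Open();   // the using blocks close the connection
                                   // whether or not an exception is thrown
                    object result = cmd.ExecuteScalar();
                    return result == null || result is DBNull
                        ? (int?)null
                        : (int)result;
                }
            }
            catch (SqlException ex)
            {
                Log(ex);           // keep the details in a log store...
                return null;       // ...and hand the caller a neutral result
            }
        }

        static void Log(Exception ex) { /* hypothetical logging sink */ }
    }
    ```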

    10- Use Normalization Carefully:

    Normalization is a very good thing: it removes redundancy from the database. But overusing it may force multiple joins for even the simplest queries, which directly impacts performance. So we can consider deliberately repeating certain columns that are often required.

    So those are some of the best practices in ADO.Net. There may be more, depending on which practices benefit performance in your situation. Please share in the comments if you know any more.

    Thank You.

    Happy Coding:-)
