Guest blogging once again for Jonathan, this is Mike Frost with part 1 of my series on Heterogeneous Data Access in .NET - Coding Directly to ADO.NET
Before I begin though, I should say a word about the reason for this series. In many conversations with .NET developers and with development organizations using .NET, we have heard a lot of confusion over the different approaches to accessing multiple databases in .NET. In some cases, what we heard was misinformation; in others, it was simply a lack of information. In most cases, though, people just had different sets of needs and experiences, and those differences often dictated a different set of requirements for everyone we spoke with. So what I'm offering isn't intended to be the "final" answer on this matter - merely a set of advice and guidelines based on our experience with this subject and with others who have wished to learn more about it for themselves.
With that said, the most obvious place to start when discussing how to develop an application that requires access to multiple relational data stores is ADO.NET. Historically (and even today, despite the many announcements concerning the ADO.NET Entity Framework), coding directly against the ADO.NET interfaces is easily the most prevalent data access approach in .NET applications. If you’ve ever built an ASP.NET application that connected to Oracle and used Visual Studio tooling to do it, then the database access code that was generated used ADO.NET under the covers.
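To make the discussion concrete, here is a minimal sketch of what coding directly to ADO.NET looks like. The connection string, table, and column names are illustrative assumptions, not from any particular application:

```csharp
// A hedged sketch of direct ADO.NET coding against SQL Server.
// The connection string and the Customers table are hypothetical.
using System;
using System.Data.SqlClient;

class DirectAdoNetExample
{
    static void Main()
    {
        string connectionString =
            "Server=myServer;Database=myDb;Integrated Security=true;";

        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();

            // Parameterized query: the developer controls the exact SQL sent
            // to the database, which is the granular control discussed below.
            using (var command = new SqlCommand(
                "SELECT Id, Name FROM Customers WHERE Region = @region",
                connection))
            {
                command.Parameters.AddWithValue("@region", "EMEA");

                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        Console.WriteLine("{0}: {1}",
                            reader.GetInt32(0), reader.GetString(1));
                    }
                }
            }
        }
    }
}
```

Note that the code above is tied to the System.Data.SqlClient provider: the SqlConnection and SqlCommand classes, and even the "@" parameter marker, are SQL Server specifics.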
The Pros – Coding directly to ADO.NET is perhaps the best option for experienced developers who have a strong background in ADO.NET, want to maintain full control of their data and require the power of the database to be at their immediate disposal. Coupled with this, writing to ADO.NET allows very granular control of database access code, which can be leveraged to ensure the most efficient database access for a particular application. Put simply, if you know what you’re doing with ADO.NET, you can probably do a very good job coding your application with this approach.
The Cons – All of that granular control means that a significant amount of application development time will likely be required. In addition, the developer will need to know which ADO.NET providers will be used ahead of time. Finally, unless the developer is a thorough planner and careful coder, he or she can fall into the trap of using provider-specific code. Unfortunately many developers are unaware of these factors during the initial development phase. As a result, it isn’t until after a significant investment of time and effort has been made that the impact of these factors becomes apparent.
The use of Visual Studio tooling during development isn’t inherently bad, nor are most developers coding without some foresight into what they are doing. Unfortunately, there is a natural tendency to take the path of least resistance. This tendency, combined with a certain level of code abstraction that Visual Studio tooling can create, often leads to applications that are mired down in huge masses of provider-specific code.
While this result might be acceptable for single data source data access (e.g. SQL Server only), it does lock the application to the provider it was originally written to work with. Consequently, adding support for additional database providers ultimately becomes a huge headache as application code and SQL statements must all be rewritten to account for differences in provider code, semantics, and SQL statement formatting. Ask anyone who has been faced with this situation and they will tell you that it’s a nightmare to deal with!
It is worth noting that the use of connection factories (link) can reduce the amount of provider-specific code with this approach. While this does not eliminate all provider-specific coding, it can help mitigate some of the hassle of trying to support multiple providers and data sources.
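The connection-factory idea can be sketched with the DbProviderFactories class that ships with ADO.NET. The code below selects a provider by its invariant name (which could come from a config file rather than being hard-coded, as assumed here), so the same code path can target SQL Server, Oracle or another registered provider; the connection string and query are illustrative:

```csharp
// A sketch of the connection-factory approach using DbProviderFactories.
// Provider name, connection string and table name are hypothetical; in a real
// application the provider name would typically be read from configuration.
using System;
using System.Data.Common;

class FactoryExample
{
    static void Main()
    {
        string providerName = "System.Data.SqlClient";
        string connectionString =
            "Server=myServer;Database=myDb;Integrated Security=true;";

        // Resolve the provider at run time instead of compiling against
        // SqlConnection/OracleConnection directly.
        DbProviderFactory factory = DbProviderFactories.GetFactory(providerName);

        using (DbConnection connection = factory.CreateConnection())
        {
            connection.ConnectionString = connectionString;
            connection.Open();

            using (DbCommand command = connection.CreateCommand())
            {
                command.CommandText = "SELECT COUNT(*) FROM Customers";
                Console.WriteLine(command.ExecuteScalar());
            }
        }
    }
}
```

Because the code works only with the abstract DbConnection and DbCommand base classes, switching providers is a configuration change rather than a rewrite - though SQL dialect differences (parameter markers, statement syntax) still have to be handled separately.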
So, to summarize:

Coding Directly to ADO.NET
Pros: granular control over database access code; the full power of the database at the developer's disposal; very efficient data access when the developer knows ADO.NET well.
Cons: significant development time; the ADO.NET providers must be known ahead of time; easy to fall into provider-specific code that locks the application to a single database.
My next post will cover an approach specifically designed to help architect .NET applications requiring heterogeneous data access – programming with the Microsoft Data Access Application Block.
Copyright © 2017, Progress Software Corporation and/or its subsidiaries or affiliates.
All Rights Reserved.
Progress, Telerik, and certain product names used herein are trademarks or registered trademarks of Progress Software Corporation and/or one of its subsidiaries or affiliates in the U.S. and/or other countries. See Trademarks or appropriate markings.