I recently finished a series of custom development projects that required me to revisit the programming world I once came from. Programming is something I started doing when I was very young and have always enjoyed. While I’ve done plenty of scripting, PowerShell, and T-SQL in recent years, my current job roles have had little need for true programming skills until recently. And wow, things have changed! In this series of posts, I’ll walk through my process for choosing a technology and developing a custom dashboard solution, along with some of the lessons I learned along the way. Here in Part 1, I’ll cover my goals for the dashboard and how I chose the platform I did. In Part 2, I’ll cover the more traditional of the two methods I explored, and in Part 3, I’ll cover a SignalR solution.
Whenever learning or refreshing skills, it helps to have a goal. In this case, my objective was to build a dashboard solution that would:
- Display near real-time data from an OLTP SQL Server data source
- Be aesthetically pleasing, as it would be viewed on multiple large-screen displays 24/7
- Refresh gracefully with no user interaction or display interruption while refreshing
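That last requirement essentially means updating data in place rather than reloading the whole display. As a rough sketch of the idea (the function and field names here are illustrative, not from any actual dashboard code), a client-side refresh might merge incoming points into the series a chart is already drawing, so nothing ever blanks out between updates:

```javascript
// Sketch: merge fresh data into the series a chart is already showing,
// so the display never clears between refreshes. Names are illustrative.
function mergeSeries(current, incoming) {
  const byLabel = new Map(current.map(p => [p.label, p]));
  for (const point of incoming) {
    const existing = byLabel.get(point.label);
    if (existing) {
      existing.value = point.value; // update an existing point in place
    } else {
      current.push({ ...point });   // append a point we haven't seen yet
      byLabel.set(point.label, current[current.length - 1]);
    }
  }
  return current; // same array the chart is bound to
}
```

Because the chart keeps rendering the same array while its contents are updated, there is no gray-out or flicker of the kind described below for full-page refreshes.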
Generally speaking, you shouldn’t reinvent the wheel if you don’t have to. So before going down the custom development road, I first looked at what pre-existing options could potentially accomplish my goals. Trying to stay within the Microsoft ecosystem and not having a large budget, I initially looked at SQL Server Reporting Services (SSRS), Power Pivot, Power View, and the current Power BI offerings. Each of these fell short, either in how much the look and feel of the display area could be customized or in how data was refreshed. SSRS, for example, allows a high level of customization over the display area, but a data refresh reloads the entire page at once and grays out the display while it runs, which is distracting for frequent updates.
As a side note, as of March 2015, Microsoft seems to recognize this gap and has been making great strides toward real-time dashboarding with the new version of its Power BI offering. It’s currently still in preview and has a long way to go, but it’s heading in the right direction. I definitely plan on doing more in-depth work with it in the near future.
Given that none of the products I evaluated adequately fulfilled my requirements, I decided the best strategy would be to develop a custom solution. While custom development can be a lot of work up front and can in some cases be difficult to maintain over the years, it is the best way to get the exact product you need without compromise. Having decided to develop a custom solution internally, I had a few more decisions to make: desktop application or web application? And once that was settled, which technologies to use within the application?
Desktop Application vs Web Application
Generally, when it comes to displaying data, a desktop application can be easier to implement than a web application, since you have complete control within the application: you control the size of the window, what’s in it, how it refreshes, and what resources it’s allowed to access. Additionally, Microsoft makes a number of .NET charting components freely available. This approach does require a program to be distributed to each endpoint that will run the application, and each endpoint must meet the application’s prerequisites (such as having the appropriate version of .NET installed). On the flip side, a web application can be a little more difficult to develop (since you must give up some control of the display to the web browser), and data refresh and security concerns can make data access more of a challenge, but you can run it easily anywhere you have a browser.
Which Platform to Use?
Given that I’d committed to using Microsoft technologies, that pretty much meant using .NET. Had I chosen to go the route of developing desktop applications, I’d be debating the merits of Windows Forms vs. WPF/XAML. But, given that I’d decided to go the web application route, that meant a debate between the merits of Web Forms vs. MVC (had I not decided to stay within the Microsoft ecosystem, Ruby on Rails or PHP would’ve been strong options).
Web Forms is the successor to classic web pages, which were typically written in a combination of HTML and ASP/VBScript and used ADO for data access. As the successor to Classic ASP, Web Forms retains many of the conventions of these legacy technologies, but it runs on the latest version of ASP.NET and allows the use of C#. Both Classic ASP and Web Forms treat each page individually: an interaction with the page posts data back to the originating page (usually), and then the action occurs. Given the quirkiness and complexity of Web Forms in larger projects, its future isn’t very bright.
MVC, or Model-View-Controller, is an architectural pattern for designing applications. To confuse things, the similarly named ASP.NET MVC is a web application framework that implements the MVC pattern on top of ASP.NET, and it is a direct competitor to ASP.NET Web Forms. Confused yet? MVC is very different from Web Forms in terms of how a program is structured (though it still sits on top of ASP.NET and is written in one of the typical .NET languages, such as C#). The largest (and greatest) difference, in my opinion, is the use of a controller: all interactions with pages go through the controller, which is responsible for sending Views (pages) to the user. I won’t go deeper into the details of MVC in this post; suffice it to say that at the time of this writing, MVC 5 is far newer than Web Forms and is still being actively developed by Microsoft.
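The pattern itself is framework-agnostic, so a toy sketch outside ASP.NET can show the shape. In plain JavaScript (every name here is made up for illustration, and this is deliberately not ASP.NET MVC code), the controller is the single entry point: it reads from the model and hands the result to a view for rendering:

```javascript
// Toy illustration of the Model-View-Controller pattern (not ASP.NET MVC).
const model = {
  // In a real dashboard this would query the SQL Server data source.
  getMetrics: () => [{ name: 'orders', value: 128 }],
};

const view = {
  // Renders whatever data the controller hands it; knows nothing
  // about where the data came from.
  render: metrics => metrics.map(m => `${m.name}: ${m.value}`).join('\n'),
};

const controller = {
  // Every request goes through the controller, which chooses the
  // model data to fetch and the view to send back to the user.
  metrics: () => view.render(model.getMetrics()),
};
```

The separation is the point: the view can be swapped without touching data access, and the model can change without touching presentation, which is a large part of why the pattern scales better than page-centric Web Forms code.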
For a project of this simplicity, either Web Forms or MVC is an acceptable choice. In my case, I chose MVC because it is being more actively developed by Microsoft than Web Forms and because it overlaps with some other, less BI-related projects I was working on.
What Chart Controls to Use?
One Last Choice — Communications Model
At this point, I’ve made a lot of choices:
- Use a custom-developed application instead of an off-the-shelf product
- Use web-based technologies within the Microsoft ecosystem
- Use the ASP.NET MVC framework rather than Web Forms
The last choice was the communications model: how the application would interface with the back-end database, and how the client would interface with the application. I evaluated two approaches: a traditional approach utilizing web services, in which the client performs all requests for data and all data refreshes, versus a more modern SignalR-based approach, in which the client performs the initial request for data but subsequent refreshes are pushed by the server.
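To make the contrast concrete, here is a minimal sketch of the pull side in client-side JavaScript (the function names are hypothetical, and `fetchFn` is injected so the transport can be stubbed or swapped; this is not code from the actual project):

```javascript
// Traditional pull model: the client asks the web service for data on
// a timer. All names here are illustrative.
function startPolling(fetchFn, onData, intervalMs) {
  const tick = () =>
    Promise.resolve(fetchFn())
      .then(onData)                                       // hand data to the UI
      .catch(err => console.error('refresh failed', err)); // keep polling anyway
  tick();                               // initial request, made by the client
  return setInterval(tick, intervalMs); // every later refresh is client-driven
}
```

In the SignalR approach, the timer disappears: the client registers a callback and waits, and refreshes arrive when the server decides the data has changed rather than when an interval fires.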
Having made all the necessary design decisions, it was time to get started! In Part 2 of this series, I’ll explore building the dashboard application with the traditional communications design, and then finally in Part 3, I’ll explore building it with SignalR.