Jeff Pries

Business Intelligence, SQL Server, and other assorted IT miscellany


Creating an SSRS Report Using Natural Earth Geospatial Data (A Shapefile Alternative)

In my previous article, I covered creating geospatial SQL Server tables using the freely available Natural Earth resources.  Natural Earth is an extensive public domain map dataset, available at 1:10m, 1:50m, and 1:110 million scales in vector and raster formats, which can be used as an alternative to ESRI Shapefiles for geospatial data.  In this article, we’ll create a simple SQL Server Reporting Services report which utilizes this spatial data in lieu of the more commonly used shapefiles to plot data on a custom map.

To get started, we’ll first need to create some assets.  Follow the steps in the previous article to set up the Natural Earth tables.  Next, we’ll create some views to simplify queries, then we’ll create a report using these assets as well as some sample data.
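As a rough illustration of what one of those views might look like, here is a minimal sketch of a view over an imported Natural Earth table that an SSRS map layer could use as its spatial dataset.  The table and column names (dbo.NaturalEarthStatesProvinces, Name, Admin, Geog) are placeholders; substitute whatever names your import from the previous article actually produced.

CREATE VIEW dbo.vw_NorthAmericaStatesProvinces
AS
SELECT
    Name,               -- state / province name, used for labels and for joining to sample data
    Admin AS Country,   -- parent country name
    Geog                -- geography column that the SSRS map layer consumes
FROM dbo.NaturalEarthStatesProvinces    -- placeholder name for the imported Natural Earth table
WHERE Admin IN ('United States of America', 'Canada', 'Mexico');
GO

In the report itself, a view like this becomes the spatial dataset behind the map layer, and the analytical (sample data) dataset can then be joined to it on the Name field.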

Continue reading

Getting Started with Natural Earth — A SQL Server Shapefile Alternative (Geospatial Resource)

SQL Server Reporting Services (SSRS) has excellent geospatial support for displaying data on a map.  Maps are typically created using ESRI Shapefiles (.shp files).  These Shapefiles are typically created with complex GIS software and made available for download (sometimes free and sometimes not) to be used.  Additionally, SSRS has an excellent default set of Shapefiles built in for the US which can show the country, states, and individual counties.

Example of SSRS Shape File showing Georgia and its 159 counties.

But what about when you need more flexibility in your geographic display?  Some examples of this may be wanting to display something that you can’t find a shape file for (maybe all the states and provinces in North America), or wanting to dynamically draw the geography based on some property of the dataset.  Geospatial data queries to the rescue!  Using SQL Server’s native geospatial support, a geospatial query can be created to return something as simple as a point or rectangle, or as complex as the geography of an entire country and all of its rivers.
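As a rough, self-contained illustration (the coordinates and variable names here are purely illustrative and not tied to any particular dataset), SQL Server’s geography type can produce the simple end of that spectrum in just a few lines of T-SQL:

-- A single point (roughly Atlanta), specified as latitude, longitude, SRID 4326
DECLARE @point geography = geography::Point(33.7490, -84.3880, 4326);

-- A simple rectangle built from well-known text; the ring lists longitude/latitude pairs
-- and closes back on its starting coordinate
DECLARE @rectangle geography = geography::STGeomFromText(
    'POLYGON((-85 33, -84 33, -84 34, -85 34, -85 33))', 4326);

SELECT @point AS SamplePoint, @rectangle AS SampleRectangle;

Run in SSMS, the Spatial results tab will render both shapes, and the same geography values can be fed straight into an SSRS map layer.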

Getting all of the latitude and longitude coordinates to create a useful geospatial query could potentially be an enormous amount of work.  Fortunately, that work has already been done in a freely available resource, thanks to Natural Earth and Laurent Dupuis.  SQL Server 2012 or greater is recommended for this process.

Example of a geospatial query, shown in the SSMS results pane, based on the imported Natural Earth data.
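For a sense of what such a query might look like, here is a minimal sketch along the lines of the screenshot above; the table and column names (dbo.NaturalEarthCountries, Name, Continent, Geog) are placeholders for whatever the import process creates in your database.

SELECT
    Name,        -- country name
    Continent,   -- continent attribute carried in the Natural Earth data
    Geog         -- geography column; SSMS renders it in the Spatial results tab
FROM dbo.NaturalEarthCountries    -- placeholder name for the imported Natural Earth table
WHERE Continent = 'Europe'
ORDER BY Name;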

Continue reading

Troubleshooting Error 404, Error 400, or “Invalid Request” or “Bad Connection” in a New SSRS Installation

After installing SQL Server Reporting Services (SSRS), are you receiving an Error 404, Error 400, “Invalid Request” error, or “Bad Connection” error when first visiting the SSRS web portal?  (The exact error message seems to vary based on version, browser, and whether you access via http/https or /reports vs. /reportserver.)

I’ve run into this a few times so I’m listing the steps I’ve used to fix it.  For me, the root cause of this error has been the SSRS Configuration Wizard automatically configuring SSRS to use HTTPS, but assigning an invalid machine SSL Certificate.  The fix is to self-generate a new and valid SSL certificate for the SSRS website to use.  The below steps are done on the machine running the SSRS web portal:

Continue reading

Microsoft Certification Exam 70-768 and 70-466 Study Tips

This past September, I attempted (and recently received a passing score for) Microsoft’s new certification exam, 70-768, “Developing SQL Data Models,” during its Beta period.  This brand new exam is a requirement toward the new MCSA SQL 2016: Business Intelligence Development certification.  Last weekend, I took and passed 70-466, “Implementing Data Models and Reports with Microsoft SQL Server,” which counts toward MCSE: Data Management and Analytics.  Both of these exams overlap heavily in the topics covered, so if you’re interested in taking them both, it’s a good idea to study for both and take them back to back.

Brand new exams, as well as higher level specialty exams in general, can be a bit of a tricky beast due to the lack of available resources.  For exams which are mainstream (such as Windows Server exams) and have been out for a while, you can count on resources such as MS Press books targeted toward the specific exam, practice tests from MeasureUp and Transcender, and other useful resources.  Unfortunately, for brand new exams, or many of the high level SQL Server exams, none of this exists.  Having successfully studied for and passed both 70-768 and 70-466, below are some tips and resources I used to prepare for each exam.

General Tips for Both Exams

The first tip is just to know the question types that the exams typically cover.  Microsoft lists all of its question types here, with examples of each.  It’s typically a good idea to pay special attention to the “Build List” type of question, which emphasizes knowing the steps, and the order, in which task components should be performed.  There seems to be a lot of love for this question type.

For both exams, the best starting place is the bulleted “Skills Measured” section of the official exam page.  Modern Microsoft exams follow this section very closely — you can practically guarantee there will be a question that ties back to each sub-item for each category.  I like to go through this section and all of the bullet points and break it down into short words or phrases that I can then use as a checklist while studying.  I’ve included an example for each exam below in the exam-specific sections.

Continue reading

SQL Saturday Roundup: #578 – Atlanta, GA (BI Edition)

Saturday, December 10th marked the 2nd annual SQL Saturday Atlanta BI Edition.  Atlanta is known for its massive SQL Saturday held every spring / summer, so I’m happy to see the smaller, more BI-focused winter event continuing on.  With such a large number of SQL Server professionals in the area, there is definitely room for multiple events.

As with last year’s event, this one was well run, with no flaws that I was aware of.  This year seemed to have a bit of a “back to basics” theme.  Many of the extras frequently seen at SQL Saturday events (lots of sponsors, attendee bags and printed materials, speaker shirts, paper session evaluations, and so on) weren’t present.  Instead, the focus was purely on providing a full day of content across multiple tracks, and you know what, that’s just fine.  (Many) free donuts were provided for breakfast, boxed lunches were purchased, and everything was adequate.  The core idea behind SQL Saturday is free training and networking, and the event delivered!  I particularly thought the session lineup for this event was a great mix of topics.

Continue reading

Office Hijinks: Halloween 2016 – Haunted Office Cemetery

Office Halloween 2016

What better way to celebrate one of the most fun holidays of the year than with a few office shenanigans?  This year, we decided to go big and built a giant haunted cemetery in the office, complete with a crypt, sound effects, and a fog machine.  It was pretty incredible!

It took us around 2 weeks from start to finish, with one very long weekend piecing everything together, but it came out pretty well.  The best and most haunted Office Cemetery I’ve ever seen!  Check out the full album below.

The Haunted Office Cemetery

[ View the Full Album ]

October Lunch and Learn Presentation – Intro to Data Visualization

Lunch and learns are a great way for a team to learn new things, share that knowledge with each other, and practice presentation skills.  We typically do casual 30-minute sessions in a group of IT developers whose work ranges from .NET to SQL databases to Business Intelligence.

For our October, Halloween-themed presentation, I chose to give an introductory presentation on data visualization, titled “Avoiding the Horrors of Scary Visualizations: An Introduction to Data Visualization.”  The presentation was targeted toward people with no background in data visualization and started with a quick history and some of the key players (Tufte and Few), bridged into some tips and best practices, and closed with a number of examples of “scary” visualization examples.

Overall, the presentation went very well and seemed to be well received.  The full presentation is available here.

Microsoft Certifications and Beta Exam 70-768 First Impressions

Recently, I had the opportunity to participate in the beta period for the new Microsoft 70-768, “Developing SQL Data Models” exam.  As part of the development process for new exams, Microsoft periodically offers invitations to take a beta version of the exam free of charge.  Passing an exam while in the beta period results in passing the test officially when it is released.  However, there are a few catches to this.  The first is that these betas are designed to be taken by people who already have knowledge of the subject and work with it on a regular basis.  Since the exam is brand new, there are no official study materials available, and there is typically a short window of time between the announcement and the close of the beta period, so there isn’t a lot of time to prepare.  These exams also aren’t for those who feel the need for instant gratification, as it may take a month or longer to receive score results and ultimately find out if you passed or failed.  But the wait is for a good cause, as the beta exams are used to determine the final version of the exam as well as the passing score, which must be completed before the beta exams can go through the scoring process to be officially scored.

For those interested in taking a beta exam and qualified to do so, I’ve found that Microsoft usually announces the availability of a limited number of seats via the Born to Learn blog which are available until the allocated number of exams are scheduled.  For this last batch of SQL Server beta exams, 300 seats were made available for each exam.  This may sound like a lot, but this is worldwide, so you have to act fast.  I’ve found that if you don’t act within the first day or maybe two of the announcement, you will probably be too late.  Once announced, you typically have to have the exam scheduled in a proctored environment within around a month of the announcement.

Like all Microsoft certification exams, the beta exams are covered by an NDA protecting the exam content, so I won’t go into exam specifics, but I do have a few general thoughts about the exam and process.  It is interesting to see how the Microsoft exams in general have changed and matured over the years.

  • I’d classify this exam as “tough, but fair” with a heavy amount of reading.  I underestimated the amount of reading, and was thoroughly exhausted at the conclusion of the exam.
  • Due to the length of the questions and answers, and re-reading questions and answers, I used almost all of my time allocated for the exam.
  • The published Exam Objectives were very representative of the exam content (shockingly so) and make a very good framework for preparing for the exam.
  • How an exam is scored is always a black box, but some questions do make mention of how they award partial credit, such as “1 point per part of correct answer” which I found to be a welcome change — nice to know for sure that partial credit is at least possible in some cases.
  • The test engine itself is new since the last time I took an exam and seemed to work well, with one exception.  An exam may be made up of a combination of “reviewable” questions, which you can go back to, and “non-reviewable” questions, which you cannot.  I’m sure there’s an intelligent reason for this.  The way the engine handled it was to present the reviewable questions first, allow the standard review-and-change process, and then proceed to the non-reviewable questions with the remaining time, only it really didn’t do a good job of explaining this up front.  I saw the question count (such as “50 out of 60”), hit next, and was taken to the review screen, which I assumed was a glitch in the question count, no big deal…so I spent a bit too much time reviewing, only to have the exam launch into the next set of questions with not much time remaining once I finished review.  It would work better if the non-reviewable questions came first, or if there were a can’t-miss explanation screen to allow for better budgeting of time.
  • The question types are made available here.  Not all types will necessarily be present in every exam, and the Build List is definitely a type to be very familiar with, as it seems quite popular.

I’m currently eagerly awaiting my results, and probably will be for some time still, as the beta period has just closed.  Should I not pass, I look forward to attempting it again, possibly in the new online proctored format, which I’m curious to try out.

SQL Saturday Roundup: #560 – Charlotte, NC

This past Saturday marked Charlotte’s 5th consecutive SQL Saturday event, and my 4th in Charlotte (I missed last year’s due to a scheduling conflict) — SQL Saturday #560.  Also fun, this happened to mark the 20th SQL Saturday that I’ve attended over the past few years — a pretty cool achievement!

As a change from previous years, this year SQL Saturday Charlotte offered two all-day precons on Friday, both of which would sell out.  With a “go big or go home” type attitude, they made life very difficult and forced me to choose between two incredible speakers and presentations: Adam Machanic’s (B | T) “Tuning Your Biggest Queries” and Jen Underwood’s (B | T) “Advanced Analytics.”  With two fantastic speakers to choose from, each with a great, and relevant, session, I had some decisions to make.

After an easy drive from north Atlanta to Charlotte after work on Thursday, I headed out bright and early Friday morning to attend Jen Underwood’s (B | T) “Advanced Analytics.”  The precons were held in an off-campus location, the Ballantyne Center, which worked out to be a pretty decent venue.  Adam’s precon was held in a standard classroom, and Jen’s precon was held in a computer lab, which provided each person a computer with a solid network connection and the ability to follow along with the presentation slides and run the lab exercises locally.

Jen Underwood presenting Advanced Analytics

Jen’s Analytics precon was an excellent session and a great introduction to the topic.  The topic is far too deep to cover in a single day session, but Jen did a good job of hitting the highlights and getting people introduced to the material and inspired by the possibilities of the subject with plenty of resources for further learning.  This was certainly time well spent!

Continue reading

SQL Saturday Roundup: #521 – Atlanta, GA

This past Saturday marked the annual event that many of us look forward to all year long.  Called by some the “Summit of the South,” SQL Saturday Atlanta is always a large and very professionally run event which draws hordes of SQL Server professionals nationwide.  Again this year, the event did not disappoint.  Despite the overcast and rainy weather which opened the day, a record 590 attendees turned out for this awesome event!

For the second year in a row, I volunteered my time to help make the event as great as it could be.  For the first time in the past few years, I did not attend any of the event’s preconference sessions, which typically run all day on Friday.  That being said, the sessions that were given covered excellent topics presented by some truly great speakers.  I’m sure everyone who attended learned a ton and had a great time.

While I didn’t attend any preconference sessions on Friday, SQL Saturday Atlanta did still begin for me on Friday, when I headed over to the site that afternoon to begin helping with prep.  We had a great group of organizers and volunteers onsite to perform the bag stuffing and venue setup.  Additionally, this year the decision was made to pre-print all of the attendee admission tickets, name badges, and raffle tickets (rather than relying on attendees to print their items at home prior to the event and bring them along).  This allowed for nicer name badges and perforated raffle tickets to be used, but it made for a very laborious process of collating and assembling the packets for the morning’s registration.  I ended up spending all of my time Friday afternoon helping out with the registration packet assembly process.

Continue reading


© 2017 Jeff Pries
