Recently, I had the opportunity to participate in the beta period for the new Microsoft 70-768, “Developing SQL Data Models” exam. As part of the development process for new exams, Microsoft periodically offers invitations to take a beta version of an exam free of charge, and passing an exam during the beta period counts as passing the test officially once it is released. There are a few catches, however. The first is that these betas are designed to be taken by people who already know the subject and work with it on a regular basis: since the exam is brand new, there are no official study materials, and there is typically only a short window between the announcement and the close of the beta period, so there isn’t much time to prepare. These exams also aren’t for those who need instant gratification, as it may take a month or longer to receive score results and ultimately find out whether you passed or failed. But the wait is for a good cause: the beta exams are used to determine the final version of the exam as well as the passing scores, all of which must be completed before the beta exams themselves can go through the official scoring process.
For those interested in taking a beta exam and qualified to do so, I’ve found that Microsoft usually announces the availability of a limited number of seats via the Born to Learn blog; seats remain available until the allocated number of exams has been scheduled. For this last batch of SQL Server beta exams, 300 seats were made available for each exam. That may sound like a lot, but the seats are worldwide, so you have to act fast. I’ve found that if you don’t act within the first day or two of the announcement, you will probably be too late. Once announced, you typically have to schedule the exam in a proctored environment within about a month of the announcement.
Like all Microsoft certification exams, the beta exams are covered by an NDA protecting the exam content, so I won’t go into exam specifics, but I do have a few general thoughts about the exam and process. It is interesting to see how the Microsoft exams in general have changed and matured over the years.
- I’d classify this exam as “tough, but fair” with a heavy amount of reading. I underestimated the amount of reading, and was thoroughly exhausted at the conclusion of the exam.
- Due to the length of the questions and answers, and the time spent re-reading them, I used almost all of the time allocated for the exam.
- The published Exam Objectives were very representative of the exam content (shockingly so) and make a very good framework for preparing for the exam.
- How an exam is scored is always a black box, but some questions do mention how they award partial credit, such as “1 point per part of correct answer,” which I found to be a welcome change; it’s nice to know for sure that partial credit is at least possible in some cases.
- The test engine itself is new since the last time I took an exam and seemed to work well, with one exception. An exam may be made up of a combination of “reviewable” questions, which you can go back to, and “non-reviewable” questions, which you cannot. I’m sure there’s an intelligent reason for this. The way the engine handles this is to present the reviewable questions first, then allow the standard review-and-change process, then proceed to the non-reviewable questions with the remaining time; it just didn’t do a good job of explaining that process. Instead, I saw the question count, such as “50 out of 60,” hit next, and it proceeded to the review screen, which I assumed was a glitch in the question count, no big deal… so I spent a bit too much time reviewing, only to have it launch into the next set of questions with not much time remaining once I finished the review. It would work better if the non-reviewable questions came first, or if there were a can’t-miss explanation screen to allow for better budgeting of time.
- The question types are made available here. Not all types will necessarily be present in every exam, and the Build List is definitely a type to be very familiar with, as it seems quite popular.
I’m currently eagerly awaiting my results, and probably will be for some time yet, as the beta period has just closed. Should I not pass, I look forward to attempting the exam again, possibly in the new online proctored format, which I’m curious to try out.
This past Saturday marked Charlotte’s 5th consecutive SQL Saturday event, and my 4th in Charlotte (I missed last year’s due to a scheduling conflict): SQL Saturday #560. Also fun, this happened to mark the 20th SQL Saturday I have attended over the past few years, a pretty cool achievement!
As a change from previous years, SQL Saturday Charlotte offered two all-day precons on Friday this year, both of which would sell out. With a “go big or go home” attitude, the organizers made life very difficult by forcing a choice between two incredible speakers and presentations: Adam Machanic’s (B | T) “Tuning Your Biggest Queries” and Jen Underwood’s (B | T) “Advanced Analytics.” With two fantastic speakers, each giving a great and relevant session, I had a decision to make.
After an easy drive from north Atlanta to Charlotte after work on Thursday, I headed out bright and early Friday morning to attend Jen Underwood’s (B | T) “Advanced Analytics.” The precons were held at an off-campus location, the Ballantyne Center, which worked out to be a pretty decent venue. Adam’s precon was held in a standard classroom, while Jen’s was held in a computer lab, which provided each person a computer with a solid network connection and the ability to follow along with the presentation slides and run the lab exercises locally.
Jen Underwood presenting Advanced Analytics
Jen’s Analytics precon was an excellent session and a great introduction to the topic. The topic is far too deep to cover in a single day session, but Jen did a good job of hitting the highlights and getting people introduced to the material and inspired by the possibilities of the subject with plenty of resources for further learning. This was certainly time well spent!
SQL Saturday Roundup: #521 – Atlanta, GA
This past Saturday marked the annual event that many of us look forward to all year long. Called by some the “Summit of the South,” SQL Saturday Atlanta is always a large and very professionally run event which draws hordes of SQL Server professionals nationwide. Again this year, the event did not disappoint. Despite the overcast and rainy weather that opened the day, a record 590 attendees turned out for this awesome event!
For the second year in a row, I volunteered my time to help make the event as great as it could be. For the first time in the past few years, I did not attend any of the event’s preconference sessions, which typically run all day on Friday. That being said, the sessions being given covered excellent topics and were presented by some truly great speakers. I’m sure everyone who attended learned a ton and had a great time.
While I didn’t attend any preconference sessions on Friday, SQL Saturday Atlanta did still begin for me on Friday, when I headed over to the site in the afternoon to begin helping with prep. We had a great group of organizers and volunteers onsite to perform the bag stuffing and venue setup. Additionally, this year the decision was made to pre-print all of the attendee admission tickets, name badges, and raffle tickets (rather than relying on attendees to print their items at home and bring them along). This allowed for nicer name badges and perforated raffle tickets, but collating and assembling the packets for the morning’s registration was a very laborious process. I ended up spending all of my Friday afternoon helping with the registration packet assembly.
Recently, I experienced an issue trying to get a new installation of Microsoft Excel 2016 to connect to SQL Server 2014 Analysis Services Multidimensional. Whether I tried to connect directly from Excel via the “From Other Sources” menu or via the Cube Browser in SQL Server Management Studio, I would receive an error that Excel was unable to connect to the database.
I verified that the service was running, that the firewall was not blocking it, and that my credentials were good. Read on for the solution to this issue.
Most introductions start with an explanation of cubes and MDX, work their way toward the syntax and structure of a query, and then cover the various syntax options in increasing difficulty. While that is a great approach, sometimes you just need to get going right away. Fortunately, there are a few good visual methods of querying an existing cube to access data without necessarily knowing the syntax and theory (though that knowledge doesn’t hurt!). This post covers a few of the visual methods of querying a multidimensional Analysis Services cube.
All that being said, it still helps to understand the structure of your data and the various relationships before trying to query it. I’m using the Adventure Works 2014 Data Warehouse and Analysis Services data sets (previously discussed here). The Adventure Works 2014 Data Warehouse contains a number of different fact tables and related dimensions. I primarily use FactInternetSales (and its related dimensions), which contains the Internet sales data for Adventure Works. The diagram for the relational data warehouse database can be seen below.
Adventure Works 2014 Data Warehouse Internet Sales diagram
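For context, the visual query builders ultimately generate MDX behind the scenes. As a rough sketch (assuming the standard Adventure Works Multidimensional sample cube, with its usual measure and hierarchy names), a minimal hand-written equivalent of the kind of query these tools produce might look like:

```mdx
-- Internet Sales Amount by calendar year, sliced to the Bikes product category
SELECT
    [Measures].[Internet Sales Amount] ON COLUMNS,
    [Date].[Calendar Year].[Calendar Year].Members ON ROWS
FROM [Adventure Works]
WHERE ([Product].[Category].[Bikes])
```

The visual tools tend to produce more verbose versions of statements like this, so peeking at the MDX they generate can be a gentle way to start picking up the syntax.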
This post serves as a full reference of all of the resources I used on my journey to learn MDX / SSAS Multidimensional. I’ll try to keep this post updated with an index of resources as I consume them. Some will be free and some will not; I’ll note which are which.
Recently, I had the need to start learning MDX to query an existing Microsoft SQL Server Analysis Services (SSAS) cube. As this can be a long and difficult journey, I thought it would be useful to make notes about specific things I learn as well as to list and review resources I find along the way. I will be paying special attention to using MDX with SQL Server Reporting Services (SSRS), as this is a very common usage scenario, but one for which learning resources are sparse.
I’ll try to keep this post updated with an index of resources as I consume them. Some will be free and some will not; I’ll note which are which. But first, some housekeeping:
I’ve been known, from time to time, to execute a fun and harmless over-the-top creative office prank. With my upcoming departure from my current organization, I thought it might be fun to launch one last bit of fun, timed to be found at some point after I’d left.
I’d always liked the idea of hiding something (non-evil) in my desk to be found at some point by a future inhabitant of the desk. I’d also heard of someone else calling the job documentation they were leaving behind a “treasure map.” So, that’s where the idea was born: why not leave behind an actual treasure map to an actual treasure? And if a treasure map is going to be created, then of course we’re going to need a pirate theme!
I had grand plans and ideas for a fairly involved hunt, but, due to time restrictions and actually wanting this to be found, I decided on a simpler approach: a basic map and instructions which led directly to the buried treasure. I also hit one other speed bump at the last minute: my office and desk were to be reworked the day I was leaving, so strategically placing clues there wasn’t going to work.
Never fear! I managed to leave a small clue in a common area of the office (which went unnoticed for a week), which then led treasure hunters to the map, which ultimately led to the prize hidden outside in the woods. The hunt was great fun and a big success! Hopefully it was enjoyed by all.
The unfolded treasure map with its bottle.
SQL Saturday Roundup: #477 – Atlanta, GA (BI Edition)
This past Saturday marked a great landmark for Atlanta: the first “BI Edition” SQL Saturday, and, for many of us, our first SQL Saturday of 2016! Atlanta has been hosting a regular SQL Saturday event for many years now, always with tremendous attendance. Given the amount of interest in the regular SQL Saturday (usually held around May each year), it was great to see a BI-focused edition launched.
With the regular event in the spring, holding this event in January spaced the two Atlanta events about as evenly as possible. And, for a first event, it seemed to be a tremendous success, with a registration wait list and around 300 in attendance.
The event was held at the local Microsoft facility in Alpharetta, GA, where the monthly Atlanta MDF user group meetings are held. All told, the facility was a pretty good choice of venue (and definitely a convenient location), but it suffered from some overcrowding. The facility had about half the sessions in roughly classroom-sized rooms and half in much smaller conference-room-sized rooms. While these smaller rooms made for an interesting and more intimate setting, they ultimately filled up very quickly.
Unlike many SQL Saturdays, the event kicked off with an opening keynote and presentation in the large room (technically multiple rooms joined together). I enjoy it when a SQL Saturday begins with some sort of all-attendee opening remarks; it provides nice symmetry with the final closing remarks session that every event has, so hopefully more events will adopt the practice.
Dandy Weyn presenting the opening remarks at SQL Saturday Atlanta 2016, BI Edition.
2016 is here at last, and it’s already shaping up to be an exciting year. Most notably, after 12 years, I decided to put the Professional Development skills that I’d been honing over the past few years into practice and start the year off with a bang — by interviewing for and accepting a Business Intelligence position with another organization.
I’m not going to pretend that leaving a stable job with an amazing organization after so many years was easy, but it was a necessary move. And I won’t say that I didn’t ask myself, more than once, if I was being crazy. Personality types such as my own value and prioritize security and stability, sometimes to our detriment: we can find ourselves in situations so stable and comfortable that it becomes difficult to challenge ourselves and grow, and easy to fall into a routine. But challenge is crucial to continued growth, learning, and advancement. I’ll dearly miss the people at my previous organization, but I look forward to the adventures ahead of me.
So, here’s to a great, and definitely interesting, 2016. My challenge to you for 2016 — if you find yourself too comfortable and stable, find a way to challenge yourself in some new way, whatever it is. Make a plan, get out there, and do it! And don’t forget to document it in your Professional Development Plan!