Impressions From Our First Month Of Using #PowerBI



Introduction

It was about a month ago that my journey (and that of many of my co-workers) into learning Power BI (PBI) began. Since then, I have tried a few things in PBI, but one of my good friends has tried a lot of things.

For this reason, we have decided to combine our experiences to write a series of articles that document our impressions, insights and lessons learned as we begin traveling down the path of learning PBI.

Caveat

Before you go any further, this article and any subsequent articles are going to be written from the perspective of professionals that work every day in the field of advanced analytics. The demands that we place on software are intense, real-world-based, and are based on decades of experience. In other words, we tend to hammer the software products that we use and our expectations are pretty high for software performance, robustness, and capabilities.

If you are looking for lightweight reviews, or reviews from a more diverse audience that includes relative newcomers to the field of quantitative and visual analytics, it is probably best if you visit review sites like this one that compare different software packages. On a site like that, you will find people who are highly influenced by differences in software pricing or other topics that are perceived to be important but probably are not.

The lessons that these people have yet to learn are many and simply indicate their lack of comprehension when it comes to understanding true software value. True value occurs when the software empowers you to work fast and accurately, without long, drawn-out learning curves and malformed or missing components. I could go on and on within this paragraph, but suffice it to say that price should not be a primary differentiator when making a software comparison like we will be doing. Software performance and capabilities are what matters and these feature comparisons are what we are after in this and subsequent articles.

When you are using analytics to chase millions of dollars of value for a company, a relatively small purchase price difference gets swallowed up every week in excessive work that is required, lost time due to missing software functionality, and inefficiencies in the workflows. These types of issues occur in software that is immature. In other words, you get what you pay for in software, just like in other products we buy.

Finally, we will be focused squarely in these articles on what we have learned while beginning our journey into the world of Power BI. Since we are new to PBI, we are unbiased except for what we have already discovered. As time goes by and we learn more about PBI functionality, we will do our best to update our findings contained within these articles.

With those things being said, I’ll complete the caveat by reminding readers that I have been negative, neutral, and positive on the two software firms that I generally review. Those firms are Tableau and Alteryx. There are numerous articles I have written that give insights for all three rating categories because I challenge these firms to do better when I think they can, and I also praise them when I think it is well-deserved.

I guarantee you that Microsoft will get the same type of treatment. They already have received a negative and positive review from me in the first two articles I have written about PBI (references provided below).

Finally, I have never made a single cent from this blog, so there is no financial motivation for me to write articles like this. I strive to be truthful in these reviews for the benefit of everyone interested in the topics.

I have spent thousands of hours over the past decade creating the ideas and techniques that are written in the blog. I have lost a lot of sleep to write these things, and I have spent a lot of money to have WordPress host this content. Therefore, I just want you to know that this information is being created free of charge to you and the spirit of the work is simply to help you learn about Power BI as it compares to using Alteryx and Tableau.

This series is my first look into what might be a serious challenger to Tableau, as I previously predicted on February 13, 2015:

These three gentlemen have the ability to understand how Tableau needs to change to remain important in the marketplace. Only time will tell how well Tableau does before the next revolution of analytics software emerges in an attempt to overthrow the leaders in the field. So far, Tableau has been doing great, but greatness doesn’t automatically last forever.


Meet My Partner in this Collaborative Work: Mr Ben Pope

Ben Pope is my friend, and he is one of the best Tableau users I have ever met. Ben is also a knowledgeable and experienced Alteryx user. Ben brings multiple skill sets to any analytics project that he is tasked to work on.

Ben combines knowledge of artistic design with data science and advanced analytics, and he produces some of the most beautiful and insightful dashboards I have ever had the pleasure of using. Ben is an outstanding user of Tableau, not only from the dashboard production side, but also from the data prep, calculations, and creative perspectives needed to produce great work. You will have to trust me when I tell you that Ben is a great guy for the job at hand.

Ben is thorough, insightful, rigorous and meticulous. Ben knows how to do data analytics and a whole bunch of other things. Last week, Ben and I were talking about the progress we (mostly him) have made in learning Power BI when he started listing for me the initial/early findings he has uncovered.

Of course, my brain immediately went into record mode because I instinctively knew that what he was telling me could potentially be useful to a lot of people. I recommended to Ben that he begin documenting his findings so that we could share them with other people. Deep inside my brain, I hoped that I could convince Ben to work with me in writing a series of articles, but I didn’t indicate that to him at the time.

Motivation for this Article

A few days after that, another highly talented Tableau user named Jonathan Drummey wrote to me and asked me if I had planned to write an article like this one on comparing Power BI to Tableau. I initially told him that I would NOT do it by myself because I had already determined that I was not going to write anything else about PBI. I have at least 20 other articles in various stages of draft mode for this blog and I’m eager to work on those.

Upon further reflection, however, I mentioned to Jonathan that I had a compadre that might be willing to work with me to supply some of the beef for the article, so to speak. So by taking information from one friend and combining it with an idea from another friend, this article was born because Ben agreed to help me. I am thankful that he did.

Limitations and Uncertainties in Our Findings Because We Are Newbies

Both Ben and I are new users of Power BI. For this reason, we are bound to get some things wrong, even as we strive to get everything right. When we do get them wrong, we would appreciate it if knowledgeable people would take the time to write comments on these articles so that we can get our mistakes corrected. We will make changes to the article(s) in response to constructive comments so that the article(s) is/are as factually correct as it/they can be.

When you are learning a new software tool, experience, usage, teachers and time are the things that make you better with the tool. Since we have certain biases in the way things are done in the tools that we are used to using, it is possible that we simply overlooked or did not know about some features of PBI that would have made it possible for us to accomplish what we were trying to do.

Additionally, since we are trying to apply PBI to real-world cases that have actual complexity as part of the project(s), there might be times when a little more PBI knowledge might be required to efficiently get some things done. What I mean by this is that our learning of PBI will be occurring in real-world situations that contain a normal level of complexity – we won’t be making simple comparison tests, in general.

Finally, we have a couple of expectations of PBI. If PBI is going to be a useful tool for us, it has to be robust and capable of working in the real world in a reasonable amount of time. Learning the tool should not take us years. We should become effective users in a few months. If we can't learn PBI quickly, this series might abruptly come to an end.

Additionally, we understand that PBI is a developing platform that is attempting to make up ground with respect to competitors. This is obvious because Microsoft publishes frequent updates like the ones shown in Figure 1 (click for big picture). These represent about the past 3 months of published updates, so it looks like they are produced every 2 weeks. This can be compared to Tableau's publication history, which has been smooth and stable over years.


Figure 1 – Microsoft publishes frequent PBI updates. Some people think this is a good thing, but I see it as a sign of an immature product that is rapidly being developed, which can easily lead to unstable software.


We will make a concerted effort to not punish PBI for its current deficiencies. However, we must speak the truth as it currently exists and search for the things that differentiate the tool capabilities so that people can make informed choices of which tools to use for the types of projects they need to complete. We promise to produce fair evaluations of Power BI and not get caught up in “he said/she said” arguments or to spew the propaganda of one company over another. We will be unbiased in what we say and do, to the best of our abilities.

Part 1 – Early Findings of Power BI Capabilities and/or Limitations While Using the Tool

Topic 1 – The Data Model Ideology

Ben likes the idea of the data model in Power BI and having the ability to complete extract, transform and load (ETL) operations in one application. The required number of steps needed to complete the work, however, indicates to him that Alteryx data operations are far superior at this time.

Ken, on the other hand, does not like the data model concept of PBI. Ken prefers the data flow programming approach employed by Alteryx as he documented in this series of articles. Ken found the PBI data model design module to be clunky, unsightly, and hard to use.

This perspective is most likely a result of his experience using the Alteryx workflow canvas methodology, which is so cleverly designed, intuitive and logical that many other methods pale by comparison.

A new user of PBI might not have the same feelings and would be OK using a step-by-step instruction list like the one that is produced by PBI. If you learn that you have to manually connect data fields together by drawing lines between them, then this is what you get used to, no matter how inefficient it is. The flow-based programming approach of Alteryx is one of those features that PBI users are probably not aware of, and therefore they wouldn't see the data model concept as being inconvenient.

Topic 2 – Data Cleaning, Blending and Preparation

On the flip side, Ben does not like having to do some of the ETL in the Power Query module (which uses M) and then do other data enrichment and calculated columns in the Table or report view using DAX.  Ben finds that going back and forth between these environments is cumbersome. In fact, during recent training we received, the trainers were having a hard time explaining to us when M is used and when DAX is used. If an experienced trainer is having trouble with this software design, that is a good indicator that the working methodology is not logical or optimal.

Ben also believes that this approach forces users to learn a lot of syntax and do a lot of custom programming to accomplish things that are commonplace in typical analytics projects. This is a loaded statement and there will be many examples given to support that last sentence. What this means in general, however, is that not only do you have to tell PBI what you want to do with the data, you always have to tell it how to do the operations. You will have to write a lot of calculations to do things that are available to you in Tableau.

Ken agrees with this finding for a couple of reasons.

First, in his first six years of using Tableau, Ken did ETL operations in Tableau, VEdit, regular expressions, custom programs and Excel. He was used to having to use a multitude of environments to prepare data for visualization. Ken was used to working this way.

Second, once Ken began using Alteryx, he learned to use Alteryx to handle all data complexities before sending the single completed data set to Tableau for visual analysis. Alteryx replaced all the other ETL tools in total and allowed Ken to focus on what to do with the data, not how to do all of the individual operations that you have to specify in PBI. Ken has found this method of working to be the most robust, fun and efficient methodology he has ever used.

The flow-based programming approach of Alteryx is completely executed on one drawing canvas, and a Tableau data extract file is created by Alteryx every time. Ken much prefers this method of working, which clearly separates data cleaning, blending and preparation from data visualization. Each component of work is efficiently completed, documented, and is instantly expandable and repeatable. When Alteryx and Tableau are used in tandem, you get to use the best of both products in a highly optimized fashion, with none of the ambiguity or loss of efficiency that occurs when switching between multiple applications for data preparation.

Topic 3 – Handling of Dates

Ben does not like how Power BI handles dates. The concept of having to create and then define the date relationships to be able to create date hierarchies, sorting and calculations is an old approach compared to how Tableau handles dates.

For example, Ben has a text field that contains date information including the quarter and year, like "Q2 2016". He needed to convert this field to an actual date data type. In PBI, he had to split the field into quarter and year, create a date table, relate the date table to his data table, and convert the strings to a date field using this calculation:

Date = if([Quarter] = "Q1", date([Year],1,1), if([Quarter] = "Q2", date([Year],4,1), if([Quarter] = "Q3", date([Year],7,1), date([Year],10,1))))

In comparison, the date conversion calculation he used in Tableau was dateparse('qqqyyyy',[Period]).

Ken completely agrees with Ben on this topic and Ken has previously written that the handling of dates in Tableau is one of its greatest strengths. Alteryx, of course, has an equally robust set of date handling functions. It is very clear to Ken that the date handling methodology currently used by Power BI is a serious deficiency that needs to be rectified sooner rather than later. There is no reason why Microsoft should be requiring its PBI users to create date data tables to be able to quantitatively operate on date fields.
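For readers who want to see this logic outside of either tool, the quarter-to-date conversion amounts to mapping a quarter label to its starting month. Here is a minimal Python sketch of the same logic as the calculations above (the "Q2 2016" format comes from Ben's example; the function and variable names are illustrative, not from either product):

```python
# Convert a "Q2 2016" style string to a first-of-quarter date,
# mirroring the nested-if logic of the DAX calculation above.
from datetime import date

QUARTER_START_MONTH = {"Q1": 1, "Q2": 4, "Q3": 7, "Q4": 10}

def quarter_to_date(period: str) -> date:
    quarter, year = period.split()  # "Q2 2016" -> ("Q2", "2016")
    return date(int(year), QUARTER_START_MONTH[quarter], 1)
```

The point of the comparison stands either way: a one-line lookup (or a one-line dateparse in Tableau) is what this operation should cost, not a date table plus relationships.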

Part 2 – Specific Insights That Compare Tableau Functionality to Power BI

There are a number of things that Ben feels like he has taken for granted in Tableau but cannot easily resolve in PBI. These insights were developed as Ben has applied PBI to a real-world problem. Some of his output from this work will be shown in an upcoming section.

The problematic PBI features or items include:

  1. Cannot arrange columns or rows by drag and drop. You have to either create a calculated field or a related table to use as a sorting key.
  2. Cannot create attribute groups by selection. You have to either create a calculated field or a related table to create the groups.
  3. Cannot hide row or column labels. You have to collapse them.
  4. Cannot disable sorting by users.
  5. Cannot have colored KPI shapes, but you can create KPI shapes using Unicode characters.
  6. Cannot alias attributes and headers without changing the underlying data.
  7. Conversion of date strings like "Q2 2016", and working with dates in general, is much more complicated than it should be.
  8. Cannot have custom marks.
  9. Cannot have dual axis charts except for the supplied line and bar visualizations.
  10. Limited undo.
  11. All the viz controls are affected through the right-hand visualizations or fields panels; for someone used to selecting things in the viz, this gets annoying.
  12. Tables can be conditionally formatted (heat map), but a matrix cannot.
  13. Conditional formatting in tables has to be based on the value of the cell to be formatted, and only measure values can be conditionally formatted.
  14. PBI frequently crashes.
  15. Data mark limitations (currently 3,500 but might be increasing?).
  16. Cannot copy and paste a specific value or range of values from a viz view.
  17. Cannot create sets from a viz view.

The net result from Ben’s first month of using PBI can be summarized as both positive and negative. On the positive side, he has been able to use PBI to complete the rather extensive ETL data work that his project required, even though some of it was a bit awkward for him to learn and complete. On the negative side, he has not been able to make some of the required data visualizations or easily export the data once the ETL operations were completed. This last statement also was independently related to me in an email from another co-worker just an hour ago.

From the Tableau point of view, Ben was able to make the visualizations he needed, but he had to do the data prep in some other software package(s) or using other programming tools or techniques. In other words, Tableau’s ETL operations were not sufficient for the full range of data work needed in this example.

In the following section, some of the required visualizations Ben created are shown. He also explains how these were created in Tableau, since they could not be created in PBI.

Part 3 – Required Visualizations That Ben Could Not Create in PBI

The examples shown below are intentionally masked. These are not being shown for full data comprehension, but rather are illustrative examples of the types of plots that Ben was not able to create in PBI for his project but he could do so in Tableau.

Example 1:

Conditionally formatting a matrix table cell with text and symbols based on another measure value, as shown in Figure 2. Each cell shows the measure for a quarterly metric, adds an indicator if it was over a threshold change from the previous quarter, and then colors the cell based on that quarter's performance against a benchmark.


Figure 2 – Conditionally formatting a matrix table cell with text and symbols based on another measure value (click for full size image).


Example 2

The example shown in Figure 3 is an indicator for a sparkline trend viz, aligned and synced with another sparkline viz. The sparkline viz varies the line weight by performance against a benchmark, is colored by change from the previous quarter, and is labeled with change indicator symbols based on exceeding a threshold.


Figure 3 – This is an indicator for sparkline trend viz aligned and synced with another sparkline viz.


 

The trend indicator viz was something Ben learned how to do today. The trend line generated by Tableau was too hard to discern in the sparklines, so he wanted an overall indicator based on the sharpness of the trend line slope. Ben needed a value he could consume to use the trend slope as a variable, and while Tableau shows it in the tooltip, it is not exposed. Ben found he could calculate it with R, but he does not have R on his computer and cannot get it installed due to corporate policy.

After a bit of Google digging, Ben found an answer in a Tableau knowledge base article that solves this using level of detail (LoD) expressions and does not need R. The results match the results shown in the trend line tooltip, so he verified that the formulation is correct. Ben said he would not have come up with this on his own and thought it was a great piece of work by Joey Minix, as shown below.

// This is based on https://community.tableau.com/message/184294 answer by Joey Minix

//y = αx + β

//where α = (n∑(xy) -∑x ∑y) / (n∑x^2 -(∑x)^2 )

({fixed[Measure],[Country],[Brand]:count([Quarter Date])}*{fixed[Measure],[Country],[Brand]:sum(([Score]-[Benchmark Score])*[floatdate])} -

{fixed[Measure],[Country],[Brand]:sum([floatdate])}*{fixed[Measure],[Country],[Brand]:sum([Score]-[Benchmark Score])}) /

({fixed[Measure],[Country],[Brand]:count([Quarter Date])}*{fixed[Measure],[Country],[Brand]:sum(power([floatdate],2))}-power({fixed[Measure],[Country],[Brand]:sum([floatdate])},2))

That’s one heck of an LoD expression!
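For anyone who wants to verify what that expression actually computes, it is just the ordinary least-squares slope α written with LoD aggregates. Here is a minimal Python sketch of the same formula; the plain lists stand in for the [floatdate] values and the [Score]-[Benchmark Score] differences within one Measure/Country/Brand group:

```python
def trend_slope(xs, ys):
    """Ordinary least-squares slope: a = (n*Sxy - Sx*Sy) / (n*Sxx - Sx^2).

    xs plays the role of [floatdate]; ys plays the role of
    [Score]-[Benchmark Score] within one fixed-LoD group.
    """
    n = len(xs)
    sum_x, sum_y = sum(xs), sum(ys)
    sum_xy = sum(x * y for x, y in zip(xs, ys))
    sum_xx = sum(x * x for x in xs)
    return (n * sum_xy - sum_x * sum_y) / (n * sum_xx - sum_x ** 2)
```

Each `{fixed ...}` block in the LoD expression corresponds to one of the sums above, computed per group; the arithmetic is identical.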


 

Example 3:

Here's another variation on the previous chart, based on an InterWorks article that he thought was neat. The plot has a dual axis with custom labels and marks, as shown in Figure 4. Dimensions are intentionally omitted from Figure 4.


Figure 4 – This is a dual axis plot with custom labels and marks


 

This chart is based on the following article: https://www.interworks.com/blog/rrouse/2015/12/14/new-way-visualize-kpis-tableau

Part 4 – Specific Comparisons of Analytic Techniques

Ken decided to try specific comparisons of certain data reading and preparation techniques to see how PBI compared to Alteryx. So far, his testing is limited but many more examples are planned for our next article.

Example 1 – Reading Undelimited Flat Files

The undelimited (fixed-width) flat file parsing benchmark, which was published in this article, was going to be used as Ken's first test case for PBI.

There was an excellent reason for choosing this test case: VB .net was used as a technique to parse the file, and the benchmark results were shown in that article (Figure 5).

This would have been a great example to see how far Microsoft has come in developing new technologies for connecting to a still-important data source. Ken has used undelimited flat files for topics ranging from global climate data, to audited financial data, to medical data, and beyond.


Figure 5 – Benchmark results for reading an undelimited flat-file.


 

To begin the test, Ken had to learn how PBI reads undelimited flat-files. He conducted a general search (Figure 6), and the results were underwhelming. He did a search in Power BI itself and found no results. So this meant that he had to begin experimenting to see if he could get Power BI to read an undelimited flat file.


Figure 6 – The General Search results for fixed-length files (i.e., undelimited flat-files).


 

So Ken did a little more searching and found this Microsoft Power BI article (Figure 7).


Figure 7 – It doesn’t look promising for PBI to be able to read this type of file. This topic is a needed area of improvement for PBI.


 

Additional user comments were found on this article (Figure 8). It looks like other people would like PBI to have this capability.


Figure 8 – Comments on the suggestion.


 

During his experimentation, Power BI failed with every attempt Ken made. He used the example file shown in Figure 9 for the testing.


Figure 9 – The example undelimited flat file I sent to PBI to see if it could read it.


 

When the file was loaded into Power BI, it automatically found breakpoints in the file that do not correspond to the actual data stored in the file. Figure 10 shows how PBI interpreted the results.


Figure 10 – The initial attempt by PBI to pick the breakpoints in this file.


 

Ken tried even the most advanced options for parsing these fields, to no avail, as shown in Figure 11. It appears that PBI cannot read undelimited files at this time, so the benchmark comparisons could not be completed.

The net result: Power BI was not even able to compete – it totally failed. In other words, little did I know that there would be no Power in the Power BI engine.


Figure 11 – The advanced splitting of columns that is available in PBI. This is not sufficient for reading undelimited flat files.
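For comparison, parsing an undelimited flat file in a general-purpose language is just a matter of slicing each record at known column offsets. Here is a minimal Python sketch; the column layout below is hypothetical (a real layout comes from the file's record specification), so treat the field names and offsets as placeholders:

```python
# Parse fixed-width (undelimited) records by slicing at known column offsets.
# The layout is hypothetical -- a real layout comes from the file's spec.
LAYOUT = [("station", 0, 6), ("year", 6, 10), ("value", 10, 18)]

def parse_record(line):
    """Slice one fixed-width record into a dict of stripped field strings."""
    return {name: line[start:end].strip() for name, start, end in LAYOUT}

def parse_file(lines):
    """Parse an iterable of records, skipping blank lines."""
    return [parse_record(line) for line in lines if line.strip()]
```

This is the kind of capability that tools like Alteryx expose directly through a column-offset configuration, and it is what PBI appears to be missing for this file type.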


Part 5 – Previous Work on The Topic of Power BI, or Inspired by Power BI

  1. Impressions From My First 5 Minutes Of Using #PowerBI
  2. Impressions From My First Day Of Using #PowerBI
  3. How Microsoft Could Put Real Power into Power BI
  4. How To Achieve Better Data Comprehension, Part 1
  5. How to Achieve Better Data Comprehension, Part 2
  6. Test 1 of Tableau vs Power BI: Topographic Mapping

Upcoming in the Next Article of this Series

From Ben’s point of view, we should be learning some more insights into how Power BI is different than Tableau. From Ken’s point of view, we should see some direct comparisons of Power BI to Alteryx.  Let the games continue…

9 thoughts on “Impressions From Our First Month Of Using #PowerBI”

  1. Thank you Ken and Ben! This is insightful and I love your approach and the depth of analysis. I’ve only used Power BI a little and there are things that I immediately like and things that I don’t. For me, one of the most striking differences from Tableau is the paradigm itself.

    (I’m definitely open to the fact that I may have a wrong first impression of Power BI, so feel free to correct me if I’m wrong)

    In Tableau, you create almost _any_ visualization using a single canvas and a defined set of behaviors that translate your drag and drop of fields into a query and then a visualization. There’s not a separate set of drop zones for a bar chart versus a heat map. You have the same shelves & marks for everything. It’s not always intuitive (although it is mostly) but as you become more in tune with the paradigm, you find you can create almost anything.

    With Power BI, you have to select a predefined canvas (bar chart, bar chart with lines, etc…) and then fill in the placement of fields for that specific canvas. Yes, you can change the chart type and Power BI does its best to re-arrange the fields, but it is a different canvas. Now, this makes some things easier (e.g. you know you want a Sankey diagram – just download that viz type for Power BI and fill it in; not so easy in Tableau). And I wouldn’t argue that it’s not intuitive. But it does make some things harder.

    While the Power BI paradigm of “pick-your-viz-then-fill-in-data” works well when you know what you want (I want a dashboard with a bar chart here and a line chart there) – in the real world, you often don’t. That’s where Tableau shines. It makes the first parts of the cycle (data exploration and analysis) relatively easy (and fun!) because you can very easily iterate and flow through various different ways of seeing the data. And Tableau isn’t too shabby at the later parts of the cycle either (visualization, dashboards, communication, data storytelling).

    I look forward to the rest of your series!

    • Hi Josh,

      It’s great to hear from you. Thanks for writing such an insightful response. Your perspective on the paradigm is one of the reasons I felt like I was in Groundhog Day during the first day of training. I felt lost when I had to pick the viz and then find out the data connections were not there to support the viz.

      When you get used to the efficiencies that Tableau provides us, it is hard to go back to the old ways of working and thinking. The whole experience reminded me of the days of creating graphics in Excel. I want to use A1:A100 and C1:C100 to create this chart. I need D1:D100 to label the marks and E1:E100 for the legends, blah blah blah.

      There are some things that Microsoft is doing that may be better than Tableau, however, and I hope to uncover those in the upcoming articles. I think we will all be learning some new tricks from this experiment.

      Thanks again for writing,

      Ken

  2. The 3500 mark limitation seems awfully low to me. I’ve just made a dashboard in Tableau with 27,000 so would I not be able to create that in PowerBI?

    The start-with-the-data-and-create-a-chart approach that Tableau adopts lends itself to creating the best chart for the data. If you start with the chart first and then fit the data to it, you end up picking the chart type first. Now, that’s fine if you know the good practice for which chart type is the optimum, but if you don’t, then you could easily end up using the wrong one, or at least not a good one.

    Would love to see what Power BI does well and better; always good to give Tableau something to aim at.

    Thanks so much for doing this review.

  3. Thanks Ken and Ben, really appreciate the in-depth analysis and insights. I’m starting to hear rumors in our organization of Power BI replacing Tableau for cost/budget reasons. Are there any compelling high-level arguments you have found that resonate with executives making those decisions?

    As you know, I’m an avid Tableau and Alteryx user. I have been trying to install Power BI to check it out, but have not yet succeeded in getting it to load. I want to check it out for myself, but am also interested in analysis and arguments that support the use of tools like Tableau and Alteryx beyond the price tag and enterprise license agreement issues. Are there business case and productivity studies that support the cause, or will you be addressing that more directly in this series?

  4. I found most of your comments in the post are biased because you know Alteryx and Tableau very well and do not know PBI well enough. If your trainer couldn’t explain why and when to use M or DAX, then I say he is not the right one. Who is the right one? Well, I advise you to follow Chris Webb for M and the Italians at sqlbi.com for DAX. You’ll learn a lot and I believe you’ll change your first insights.

    best,
