Reflections of the Past and a Vision of a #Tableau Future

Introduction

A few days ago, I published an example of drawing a finite-element grid in Tableau. I thought it would be just another post with very little readership, one that gets buried in my archives until my blog goes away. For some funny reason, however, this post created some dialog and ignited a spark of creativity in my brain that I need to put onto paper. Hence, this blog post was born, and now I have to say what I need to say before my brain moves on to something else.




One of my favorite phrases, which I heard many years ago as a geology student, is “ontogeny recapitulates phylogeny” (ORP). That phrase just sounds cool. As a young man, I remember using it at strategic times just to make it look like I knew something, when in reality I was clueless. I believed back then that it meant that children are likely to be just like their parents.

The theory is much deeper than that and involves embryonic development stages and blah, blah, blah. In biology, this theory has now largely been discredited. However, recapitulation theories still exist in other fields like language and cognitive development. These theories suggest that as new things develop, they carry with them signatures or aspects of their past. Software is a lot like that. Products get built to replace other products, and the cycle goes on over time.

As I think about the future of software products, I can see, looking backwards, that new products sometimes emerge that are enhanced versions of existing products. New products replace old products, but in many ways they are built upon the ideas and concepts of older products.

When Samsung releases a new smartphone, they say it is the “next best thing”. Their new smartphone is built upon the lessons learned from their previous phones. In software, updates are released to enhance the software’s capability and user experience, but every once in a while an entirely new product emerges that changes the playing field entirely. For me, that product is Tableau software, and here are my thoughts on what this product could become in the future.

Reflections of the Past

Looking back through the years, it is amazing to me what we can now do with software and computers. The first finite-element grids I drew were done with pencil and graph paper. I remember designing a grid while traveling through airports in the mid 1980’s and having to tape together sheets of paper to complete the job.

I recently found a hand-drawn 2-D grid I used to solve one of my first groundwater flow problems at a nuclear site. That problem had hundreds of nodes and elements. When I was finished with the grid, I had to manually type in the x and y coordinates and then the element and nodal connections. ASCII editors were used for this work back in the days of MS-DOS, before Windows burst onto the scene. We got the job done, but it wasn’t fast and it wasn’t that much fun to work that way.
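
For readers who have never built one, a finite-element grid in its rawest form is just two small tables: node coordinates and element connectivity (which nodes make up each element). The sketch below is a minimal, hypothetical illustration of that structure in modern Python; the file layout, node numbering, and parsing code are assumptions for illustration only, not the actual formats we typed into those ASCII editors.

```python
# Hypothetical illustration only -- not the actual 1980s file format or code.
# A simple 2-D finite-element grid is just node coordinates plus element connectivity.

node_text = """\
1  0.0  0.0
2  1.0  0.0
3  1.0  1.0
4  0.0  1.0
"""

element_text = """\
1  1 2 3
2  1 3 4
"""

def parse_nodes(text):
    """Return {node_id: (x, y)} from 'id x y' lines."""
    nodes = {}
    for line in text.splitlines():
        node_id, x, y = line.split()
        nodes[int(node_id)] = (float(x), float(y))
    return nodes

def parse_elements(text):
    """Return {element_id: [node ids]} from 'id n1 n2 n3' lines."""
    elements = {}
    for line in text.splitlines():
        parts = line.split()
        elements[int(parts[0])] = [int(p) for p in parts[1:]]
    return elements

nodes = parse_nodes(node_text)
elements = parse_elements(element_text)
print(nodes)     # {1: (0.0, 0.0), 2: (1.0, 0.0), 3: (1.0, 1.0), 4: (0.0, 1.0)}
print(elements)  # {1: [1, 2, 3], 2: [1, 3, 4]}
```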

Fast forward 20 years and I was building 3D finite-element and finite-difference grids with many millions of nodes and elements (or cells) with the help of some great software tools. During those 20 years, I helped develop boundary element methods for the automatic generation of finite-element grids. I developed my own codes for creating site-specific grids with irregular domains like the example shown in the finite-element blog post.

I developed grids and numerical models for everything from helping to design and build locks and dams on the Mississippi River to simulating groundwater flow around the subsurface linear accelerator (linac) of the Spallation Neutron Source in Oak Ridge, TN (Figures 1 – 3). This work would never have been possible without the development of the grid generators. Great software was built to replace a previously manual process, which advanced the science and allowed for the development of creative numerical model simulations.


Figure 1 – The Spallation Neutron Source in Oak Ridge, TN.


 


Figure 2 – The finite-difference grid used for the Spallation Neutron Source Site.


 


Figure 3 – A view down the linac line during construction. If I were to stand in the same place now, I’d be history, immediately, because either the magnetic field or the neutron beam would dispose of me. We are literally looking down the barrel of a gun that shoots neutrons near the speed of light!


 

Detailed finite-element and finite-difference grids were only part of the story. Once you built the model and simulated the groundwater system, you had to have a way to visualize the results. Since software tools didn’t exist for the numerical model codes I was using, I had to write my own and this took a lot of effort.

I developed contouring algorithms for the rapid processing of the model results. I built general graphical post-processors that could be used with multiple computational models to display contours, flow vectors, and time series of simulated variables. At the time, these codes were built with compilers, linkers, and external math and graphical libraries.
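
As a rough, modern stand-in for what those post-processors did, here is a minimal sketch that contours a nodal result on a small triangular finite-element mesh with matplotlib. The mesh, the simulated values, and the library choice are illustrative assumptions; the original codes were custom compiled programs, not Python.

```python
# Illustrative sketch only -- contour a nodal result field on a triangular mesh.
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.tri as tri

# Hypothetical nodal coordinates and element connectivity (0-based node indices).
x = np.array([0.0, 1.0, 2.0, 0.0, 1.0, 2.0, 0.0, 1.0, 2.0])
y = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 2.0, 2.0, 2.0])
triangles = [
    [0, 1, 4], [0, 4, 3],
    [1, 2, 5], [1, 5, 4],
    [3, 4, 7], [3, 7, 6],
    [4, 5, 8], [4, 8, 7],
]

# A made-up simulated variable at each node (e.g., hydraulic head).
head = np.array([10.0, 9.5, 9.0, 9.8, 9.3, 8.8, 9.6, 9.1, 8.6])

mesh = tri.Triangulation(x, y, triangles)
fig, ax = plt.subplots()
filled = ax.tricontourf(mesh, head, levels=10)  # filled contours of the nodal field
ax.triplot(mesh, color="k", linewidth=0.3)      # overlay the element edges
fig.colorbar(filled, ax=ax, label="Simulated head")
ax.set_aspect("equal")
plt.show()
```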

Since what I was doing was at the forefront of development, I had to work closely with the software teams that were building the graphical libraries I was using. We had to do optimization work to get things running fast enough because the computers were limited in memory and computational speed. These tools had to be created because no general software tools were able to do what we wanted to do.

We had to have vision and creativity to solve problems by writing custom codes for the models we used. We tried to generalize them to work with a lot of different models, but we quickly found out that this was hard to do because each model had its own output data types and formats, and these were changing all the time.

Therefore, to maintain a general software tool like this, you were in a constant state of development, which would slowly but surely grind you to a halt. We realized that there had to be a better way than writing a tool that had to maintain a connection to the program that created the output it analyzed.

As the years ticked by, a few products for graphical analysis of model results began to grow their user bases. Tecplot, a graphical processing engine born out of the need to visualize computational fluid dynamics results, began to be used for groundwater and surface water models (Figure 4).


Figure 4 – Tecplot can now work with 1 billion cells.


 

Visual Modflow, Groundwater Vistas, GMS, and other programs were developed over many years to give us the ability to visualize our scientific data. Certain groundwater models gained traction, and their accompanying graphical post-processors grew in complexity. For example, Modflow-Surfact emerged as a very capable simulation tool, and it was initially compatible with Tecplot and Groundwater Vistas for graphical processing of model results. Now, several years later, it is compatible with other graphical post-processing engines, or Graphical User Interfaces (GUIs), as shown in Figure 5.


Figure 5 – Modflow-Surfact fact sheet showing which GUIs it is compatible with.


 

The primary problem with this approach is that all these GUIs have their own workflow paradigms and require insider knowledge to get them to work well. There are no standards for visual best practices, nor are there standards for how the graphics are produced. Significant learning curves exist for these tools, and there isn’t much consistency in how things are done. Since these GUIs handle 3D and 4D (time) data, you have to work with them frequently to stay in a good workflow because the way they handle the third and fourth dimensions varies so much. In other words, these tools are not intuitively obvious to use.

I recently went back to the GMS GUI, a tool for which I was a technical reviewer many years ago and which I worked with extensively. When I tried to use it, I was lost. It took hours for me to do anything useful. This tells me that the design of this software is flawed. Too much insider knowledge is needed, and not enough thought went into making the tool easy to use. Eventually this product will have to change or it will be supplanted by another, easier-to-use tool.

With all that being said, when I first saw Tableau back in 2008, I was stunned by its intuitive design.  After years of grinding through the process of writing custom codes and using time-consuming Excel graphics, I was entranced with Tableau.  I immediately knew which path my software future would be taking. Hence, I have used Tableau nearly every work day since that time, and many days on the weekends.

Thoughts of a Tableau Future

I wish we could get the Tableau software company to realize that their software framework, with some minor modifications and an expansion of graphical offerings, could explode onto the scientific visualization scene. I can clearly see the day when “Tableau Scientific” is released (Figure 6). This product would instantly gain traction because of its intuitive design and its flexibility in connecting to data sources. If this tool were created, Tableau wouldn’t be known only as a business tool – it would be known as a problem-solving tool with a much larger user base than it currently has. Tableau would become the de facto tool of choice for producing graphical results.


Figure 6 – Could this be a future product?


 

If you think of MS Excel, it was initially created to do financial-type calculations, and it was primarily thought of as a business tool. By the mid-1980s, Microsoft saw the dominance that Lotus 1-2-3 and Quattro had in the spreadsheet marketplace, and they decided that they could build a better product. I remember the day I decided to switch from Quattro to MS Excel because it was bursting onto the scene with better graphics and overall capabilities. My friend in graduate school was using Excel on a Mac, and I was really impressed with what he could do. So when the Windows version came out in 1987 or so, I resisted the change at first, but then I realized it was time to switch. As I look back on that decision, it was the right one. Now, some 25 years later, Excel is a standard tool used for everything, in every business and every discipline in a company.

When I first saw Tableau in 2008, I immediately recognized its abilities and started using it 100% for the production of graphics and 80% for computations. Now it is nearly 100% for both graphics and computations because it is painful for me to do graphical operations in Excel. Sure, I still use Excel for a few things, but Tableau has supplanted it as my primary analysis platform. Tableau could become just as ubiquitous if Tableau management could expand their vision of its future. The details of how this could happen are stored in my brain, and one day I’ll have the energy and determination to write that post, but for now, thanks for reading.

9 thoughts on “Reflections of the Past and a Vision of a #Tableau Future”

  1. Ken – I hope Tableau Software is listening!
    My path in the engineering consulting world is similar to yours. I used Visual Modflow to make groundwater flow models. I experimented with presenting model results in ArcGIS or, at the very least, massaging Visual Modflow’s output in CorelDraw or AutoCAD for final presentation in client reports. I also did a lot of work in Golden Software’s Surfer and Grapher.
    When I discovered Tableau a couple of years ago, my first thought was “This is great, but it’s a business analysis tool, how can I fit it into my workflow?”. I looked for examples of Tableau use in engineering and found none. In the last year I developed my own Tableau solutions to aid in my work on environmental science projects but I developed more dashboards for the business operations team.
    With Tableau’s current success, there may be little interest on their part in expanding their service offering. But, as the market becomes more competitive, adding functionality to tap into the scientific/engineering visualization market may become a welcome opportunity for them.
    I’d like to call on Tableau to draw from experience of science/engineering users like Ken and create a task force to research the possibilities!!

    • Hi George,

      The one thing that Tableau does better than most software companies is to listen to their user base. Of course they will not discuss their creative plans as those are strategically important to their future success, but I’m hoping that by showing a few examples of what is already possible, someone at Tableau will recognize this possibility.

      I might have a keen perspective on this topic because of my history. In 1988, I wrote a master’s thesis titled, “A Microcomputer Groundwater Data Analysis Program.” This thesis was written in Pascal and had some really advanced graphics (joke). I promise to write a blog post about it when I find my 400-page printed copy down in the basement. You will laugh when you see the graphics and what I was able to do back then. I literally had to draw my graphics by mapping out lines, circles, and text pixel by pixel. Back then, a high-resolution monitor featured a whopping 640 by 200 pixels, or something close to that, and you were lucky if it was a color monitor!

      Based on the title of my thesis, you can see that I have been doing data and graphical analysis for quite a while. I also used Golden Software’s Surfer and Grapher for years. I used AutoCAD for so many years (1987 – 2010) that I still think in terms of those keyboard shortcuts and how that interface would be awesome with Tableau Scientific. Back then, we had to use AutoCAD to do the types of things we wanted to do, but it required writing LSP routines and a lot of customization to get where we wanted to be. For these reasons, I have seen the emergence of tool after tool for this type of work. Tableau is simply the best tool there is because it was built from the ground up for fast analytics with an emphasis placed on intuitive design. Its ability to make data connections on a desktop or in the cloud is a definite discriminator and separates it from many other products. However, as with any product, there is always room for improvement and expansion, and I hope that the message gets through to them.

      Last year, in an interview during TCC13, Tableau CEO Christian Chabot said that his biggest problem is finding talent to fuel Tableau’s development team so that they can accomplish even greater things in the future. From my perspective, this shouldn’t be a problem for him at all. I can name a minimum of 10 to 20 people from the Tableau community who have demonstrated the requisite knowledge, passion, and creativity to help him grow the company. He should hire them by paying them what they are worth, form a special division of these people, put them into a think-tank, and allow them to innovate. If he did that, the game would be over and Tableau would emerge as the champion. If you think I’m crazy, then you should learn more about how Google is now thinking about the need for these types of people by listening to this podcast: https://wordpress.com/read/post/feed/25134/504235798/

      Ken

  2. Tableau created a wonderful culture around their product, and the engagement of Tableau users is second to none. This results in tons of innovation, feedback, insight, and promotion that Tableau receives from really smart people without ever putting them on their payroll. I can only assume that this enthusiasm and energy is duplicated in their corporate culture. So why is there a problem finding talent to join Tableau’s research and development? I guess the majority of these really smart people who contribute to the Tableau user community are happy where they are in their careers and satisfied with their part-time relationship with the Tableau corporation. Maybe there is not enough talent out there to fuel such rapid corporate growth. Or perhaps Tableau’s R&D is not defined and visible enough to the outside world to lure more great talent.

    • These are all good points, George. Compensation could be an issue, as could Tableau losing their talent to competitors, as has been known to happen in the computing sciences. If Chabot was worried about finding talent last year, what is he going to do over the next two years, when he has promised to spend more money on R&D than in all previous years of Tableau history? Certainly, Tableau benefits from its giving community. There are a lot of people who give ideas and promote the brand with great enthusiasm. This will take Tableau only so far, however.

      To continue its stellar growth and superb improvement in its software, Tableau would be wise to follow the leadership of Google. What Google has figured out is that they now prefer to hire generalists, rather than specialists, because the generalist isn’t married to particular solution techniques. Generalists are more apt to be creative thinkers that are capable of innovation. Although specialists are great at what they do, they tend to be limited in their breadth of knowledge and tend to be threatened when they are asked to move outside their domain of interest. Innovation is key to driving Tableau’s future.

      In a fast-moving business world where technologies are being developed at an amazing pace, companies are better served with generalists on their team rather than specialists. If companies hire a group of specialists to solve the problem du jour, once that problem is done, they have a staff that is less flexible over the long term. The key to finding these types of creative people is to look outside your current industry for people who are passionate, have talent, and are adaptable to the job at hand. Many of these people think they are stuck in their current job situation but are fully capable of providing the horsepower needed to overcome the challenges faced at a place like Tableau. The problem for these people is that nobody is willing to give them a chance to show what they can do.

      If Tableau’s management is as good as I think they are, they will realize this lesson sooner rather than later and follow Google’s lead. The biggest mistake Tableau can make is to think that just because they have developed a great product and are the subject experts with that technology, no one else can help them. That type of approach has caused many companies to either lose their edge in the marketplace or collapse entirely. Momentum is a hard thing to create and an even harder thing to maintain. Right now Tableau has the momentum, but there is no guarantee that it will remain with them over the long term.

      An infusion of multi-talented people into an existing team always leads to a better solution than keeping the team limited to like-thinkers. What the Tableau talent acquisition team should do to help the company reach its goals is identify these people and promote them to management, and then management should call them with an offer they can’t refuse. Chabot just needs to say to these people: “You are going to come to work for Tableau. When can you start?” It isn’t as hard as C. Chabot thinks it is. “That is all I have to say about that”, to quote Forrest, Forrest Gump.

  3. Pingback: Using Mathematical Modeling And #Tableau To Predict The Flu | 3danim8's Blog

  4. Hello Ken, I’ve just come across your blog and found it very interesting. Our experiences have some parallels. My first experience in computer-based data analysis was with COBOL and punch cards, analyzing bird banding data in the 70s. In 1985 I started using FOCUS, the leading 4GL reporting tool/product, and in ’86 I went to work for IBI, FOCUS’ vendor, where I spent several years as a FOCUS consultant before moving into product management, where I, among others, advocated for the adoption of then-modern approaches to using computers as we moved away from the mainframe model to other, human-oriented ones.

    At that time IBI was in a position similar to Tableau’s position today. They had the best product ever created for doing straightforward business data analysis, it was pretty much the linear descendant of the creative spark of a single person (ORP), and it was already stagnating. FOCUS didn’t move forward and lost its position as the best tool for data analysis by nontechnical people.

    Tableau is hamstrung by its basic paradigm of what it does and how it does it. Looking back at the information on Polaris, the Stanford project that was Tableau’s incubation (http://www.graphics.stanford.edu/projects/polaris/#pubs), there’s this: “The Polaris interface is simple and expressive because it is built on top of a formalism for describing table-based graphical representations of relational databases.”

    Briefly, Tableau is wedded to the original problem domain and design principles it was created with. Like many successful products, these served it well when it was young and simple, but they inhibit its evolution into broader spaces. This discussion goes broad and deep, but here are a couple of examples:

    Tableau is (almost utterly) data-driven – everything the product does results from the configuration of data elements in the UI. If you want Tableau to behave a particular way you need to figure out how to construct and configure the data so that Tableau responds to it and does what you want.

    There’s no ability to express behavior algorithmically to control how Tableau behaves. This is a consequence of the previous point, but it bears calling out on its own. This leads to entire rich ecosystems of increasingly complicated hacks that people construct to get Tableau to behave in the desired way. I’ve been reading this morning a number of very impressive articles on selectively hiding and displaying dashboard components, all driven by the use of pseudo-data carrying worksheets that are displayed or not according to whether they are selectively filtered to have contents or not.

    Tableau only understands single-grained, flat record sets. Whether these are single real tables or the result sets of joins or blending, Tableau cannot recognize even a simple two-level hierarchy and respond appropriately to simple analytical operations, e.g., sum budget, salaries, and business expenses by department by salesperson (see the sketch below). This is particularly irksome since this capability was part of the business data analytical tools created in the ’60s and ’70s (Ramis, Nomad, FOCUS, etc.), and its absence renders whole classes of analyses impossible within Tableau’s direct-action/result operational design.
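
    To make that concrete, here is a minimal, hypothetical sketch in Python/pandas of the kind of multi-grained aggregation I mean; the tables, column names, and numbers are invented for illustration and say nothing about Tableau’s internals.

    ```python
    # Hypothetical data: budget lives at the department grain; salaries and
    # business expenses live at the salesperson grain.
    import pandas as pd

    departments = pd.DataFrame({
        "department": ["East", "West"],
        "budget": [100_000, 80_000],
    })

    salespeople = pd.DataFrame({
        "department": ["East", "East", "West"],
        "salesperson": ["Ann", "Bob", "Cal"],
        "salary": [50_000, 45_000, 55_000],
        "business_expenses": [4_000, 3_500, 5_000],
    })

    # Aggregate each table at its own grain first, then combine at the department
    # level. A naive flat join would repeat the department budget once per
    # salesperson and double-count it when summed.
    by_person = (salespeople
                 .groupby("department", as_index=False)[["salary", "business_expenses"]]
                 .sum())
    summary = departments.merge(by_person, on="department")
    print(summary)
    #   department  budget  salary  business_expenses
    # 0       East  100000   95000               7500
    # 1       West   80000   55000               5000
    ```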

    Tableau’s presentation space is rigid and inflexible, composed of a matrix of identically sized cells formed by the nested, sorted organization of dimension members arranged to the left of, above, and below the cell matrix. This presentation space artificially limits the useful analytical presentations one could construct.

    Tableau is also deficient in explaining how its product works. Most of what the Tableau community knows about how Tableau really works, and how to take advantage of it to achieve useful results, comes from people in the community who have spent enormous amounts of time fiddling, ferreting, probing, experimenting, and puzzling out how it works and teaching the rest of us. Joe Mako led the way in the early days (I’ve been using Tableau since 2007), and his lead has been followed by a number of people who have continued to illuminate how Tableau works and how to take advantage of this knowledge to achieve useful results.

    The fundamental problem here is that as people dig deep, figure out how Tableau works, and turn that into generative information showing how to do something useful, no matter how byzantine or complex, Tableau becomes increasingly fixed to the way it is, reinforcing rather than rethinking its fundamental approach to what it’s nominally trying to help people do: access, analyze, and understand their data.

    To your and George’s points above about Tableau’s recruitment (or not) of talented people to help them move ahead: the idea that Tableau is getting huge value from the information coming in from the community through their forums, and the Ideas forum is notable here, holds water. There’s no end to the suggestions that come from the community. But this is not an unalloyed good thing.

    Tableau is suffering from the organizational equivalent of groupthink. The community’s ideas, and I believe Tableau really does pay attention, are vastly oriented around cosmetic and superficial concerns that reinforce the way Tableau is, rather than deeper considerations about how to fundamentally improve the product. As people figure out ways to twist it into shape and make it dance new steps, they’re reinforcing its nature even as they’re making it more true that achieving good, useful analytical results requires deep mastery, ironically the antithesis of Tableau’s basic and great value: that achieving these is simple and straightforward.

    As to why Tableau doesn’t hire good, talented people? I think they do. I know many people at Tableau, and they are all bright, accomplished, helpful, motivated, and interested in doing good things. Tableau has hired a lot of people from Microsoft over the years; apply whatever your opinion is of MS’s innovation record here. I also know people who Tableau has declined to hire, people who one would think are those very people you describe in your Sept 30 post who could help Tableau evolve and become a better product, not just a more valuable company.

    Is Tableau the victim of the innovator’s dilemma? It smells like it.

    Tableau is a spectacularly good product in the space it was created for; there’s not much room between it and as good as any product could be. But that is a fairly small space, and Tableau shoehorns additional functionality into the same paradigm it started out with, even though that paradigm is not suitable for the larger world.

    When Tableau showed up, it was a revelation. After many years of waiting, there was finally a tool that made basic business data analysis simple, easy, and straightforward. I happily paid for a license, started using it in my work, and haven’t regretted it.

    And now I’m wondering what the next great tool will be. Tableau shows no evidence that it’s moving to address its fundamental problems, and is in fact responding to the forces that extol its complexities as virtues.

    There’s an aggressive evolution happening in data visualization, technologies are emerging that make it relatively straightforward to create spectacularly good visualizations, from the simplest to the most sophisticated. D3.js is the best known.

    The real question is: who’s going to come out with the tool that presents a clean, simple, modern interface in front of the modern visualization technologies and be the next disruptive product?

    • Hi Chris,

      Your comment is obviously based on a lot of experience using Tableau and other precursor software packages. I have now read what you wrote three times. Each time through, I have been able to appreciate more of what you said and see the connections to things I have written. This is truly the deepest comment I have ever received on my blog. Thank you.

      Ken

  5. Pingback: Impressions From My First Day Of Using #PowerBI | 3danim8's Blog

  6. Pingback: How To Achieve Better Data Comprehension, Part 2 | 3danim8's Blog
