Everyone experiences rites of passage into adulthood. Sometimes it's marriage, having a child, or just graduating from college, when we mentally morph from idealistic, passionate, and carefree teenagers into responsible adults. As a long-time observer, and some would say stern 1st grade teacher, of Tableau since its early childhood, it is clear to me that Tableau has now put on its big girl pants. Tableau Conference 2016, its annual user conference with more than 13,000 attendees, was Tableau's coming-of-age party.
While the classic user enthusiasm was still there with fan favs such as 'Devs on Stage', where the developers themselves (notably four out of five of them women) introduced new features to a rousing crowd, the tone and energy level seemed a bit subdued compared to previous years. Perhaps that's because 20% of attendees self-identified as being in IT or data and analytics leadership roles, and, admittedly, we are a more boring bunch. More likely, though, it's because Tableau deployments are becoming serious, mission-critical business for many Tableau customers. Tableau also has a new, professional CEO, Adam Selipsky, formerly number two at Amazon AWS, who kicked off the event. It was clear that Adam has a serious mission to lead Tableau into the post-$1 billion in revenue realm of great companies. His plans are to leverage Tableau's large installed base to:
- Expand beyond the business user to focus on enterprise deployments and IT buyers, which is critical as the market has matured and customers are now demanding that Tableau support larger and more complex deployments.
- Deliver new products beyond its core desktop and server products and its cloud offering, Tableau Online, including expanded cloud options and a new standalone self-service data preparation solution, code-named Project Maestro. Others are likely to come. This will be critical to satisfy demanding Wall Street investors looking for growth and profitability from an adult Tableau.
- Expand partnerships and build vertical expertise – this too is about market scale and leverage.
We also learned a lot about Tableau's roadmap for the next 12 to 24 months in the main keynote, delivered by fan favorites Christian Chabot, founder and now Chairman of the Board; Francois Ajenstat, Chief Product Officer; Andrew Beers, Chief Development Officer; and Dan Jewett, VP of Product Management.
The vision revealed in the keynote had an unmistakable emphasis on the enterprise, but as the market for visual-based data discovery mainstreams, Tableau also laid a foundation for building a next-generation, 'smart' BI&A platform.
What’s on Deck for 2017?
- Tableau for Linux!! This includes support for LDAP.
- The ability to create a data catalog of certified data sources, promote user-modified sources, build individual and group sandboxes, and do visual impact analysis, all in support of agile trust and governance. Machine learning-enabled smart recommendations will surface data sets, and ultimately content, based on the behavior of users and their cohort groups. This will be part of the confusingly named Data Server, which is a feature of (not separate from, and not found by name in) Tableau Server. Can we just call this Tableau's agile data cataloging capability? Naming aside, this has the potential to be differentiated functionality that will help govern Tableau deployments and put the right data in the hands of users faster, so they can get the most value.
- A new in-memory data engine, based on Tableau's acquired Hyper technology, to speed data ingestion and analysis of large, complex data sets. This new engine will replace the .TDE by the end of 2017. Tableau assures us that this will be a seamless migration for users and that it will continue to support both file formats for the foreseeable future. Time will tell.
- A new standalone self-service data preparation product, based on Hyper, for visually harmonizing, cleaning, and transforming large and complex data sets. This will reduce the need for the 3rd-party self-service data preparation tools that have been necessary in many large, data-complex Tableau deployments, adding to TCO. This product, in combination with the new agile cataloging capability (I just can't bring myself to call it Data Server), is potentially a strong differentiator. Pricing and packaging are still a work in progress.
- It was clear that Tableau intends to double down on the cloud. The hiring of the number two person at Amazon AWS was a clue, but many of the detailed announcements and plans also focused on making it easier to deploy Tableau in any cloud configuration: public, private, hosted on any provider, or hybrid. Nearer term, in 2017, Tableau's Live Query Agent feature will be the most important; it will enable access to on-premises data from the cloud. This is an important competitive improvement for Tableau versus other hybrid data cloud offerings, such as Microsoft Power BI, because hybrid data access is a necessity for organizations that are transitioning to the cloud but still have critical data assets on premises.
- Equivalent authoring capabilities on the Web as on the desktop. This has both deployment and potential licensing implications, as most users will no longer need both a desktop and a server license for enterprise deployments. Stay tuned for how Tableau will handle this shift, particularly on the licensing front.
- A continued evolution of Tableau’s APIs for easier embeddability and extensibility.
Core Platform Enhancements
- As more evidence that the platform is maturing, Tableau will add discussion threads with point-in-time visualization snapshots to give collaborators context in the discussion. Users will also be able to define and monitor metrics and receive alerts on changes based on defined or dynamic thresholds. This, plus other announced features like geospatial layering, are competitive catch-ups, but nonetheless important platform evolutions.
Next Gen BI&A
- Tableau will complement its core visual exploration approach with natural language processing (NLP)-based search, giving users new ways to interact with data. Tableau's approach is to support complex questions and compound queries, which have been a challenge to date for many search-based data discovery tools.
- Tableau will also add the beginnings of machine learning-based automated insights, called 'Instant Analytics', that will present users with key insights in their data. In conjunction with the new agile data cataloging features in Data Server (there, I said it), these capabilities will give recommendations for data sources, workbooks, and ultimately metrics, based on a user's context, usage, and peers' usage. Gartner has defined these as next-generation 'smart' capabilities that will likely represent core, mainstream features of BI&A platforms in 3 to 5 years, much like visual-based data discovery has become mainstream today.
Visual-based and search-based data discovery have largely been separate product segments (with search-based exploration adoption lagging behind visual-based), as have graph-based data discovery and Hadoop-based data discovery tools (like Datameer). Gartner predicted back in 2015 that:
- By 2018, smart, governed, Hadoop-based, search-based and visual-based data discovery will converge in a single form of next-generation data discovery that will include self-service data preparation and natural-language generation.
While Gartner may be off by two or so years, this next wave of disruption has begun. It's happening. And like the visual-based exploration wave before it, those vendors not on board will be marginalized. With the addition of Hyper, Project Maestro, NLP-based exploration, instant insights, and recommendations to its roadmap, Tableau is positioning itself to ride this next wave, at enterprise scale and with trust and governance. Only execution remains, which is non-trivial in an increasingly crowded market.
Bye for now!