Along with a small team from Goodwin, I (pictured at far right in the photo) attended Tableau’s massive “Data16” conference in Austin, Texas, from November 7 through 11, 2016. The conference focused on the broad uses and deep technical functions of a single product: the Tableau data visualization program, which I have written about here.
Like other tech conferences, this one offered attendees many benefits, including hands-on training, meeting peers facing similar issues, hearing about product innovations and developments, and learning strategy and adoption techniques from other organizations that have implemented a data-driven culture. And make no mistake, legal KMers, there is such a thing as a data culture: one that values insights and knowledge gained from data but that also enjoys a good muddy wrestle with a data cube, relational database, or clustering algorithm.
I went to build my hands-on skills developing Tableau dashboards and visualizations for my firm, and to better understand how to enhance data-informed, visualization-based decision-making.
From my perspective, key events were the product keynote, the Iron Viz competition, and the many training sessions.
The event hashtag was #data16.
One of the most important events of the week was the main keynote on Tuesday morning, attended by seemingly everyone at the conference along with some 10,000 viewers online. It was conducted by a succession of Tableau product managers and executives.
The keynote featured multiple product development announcements, covered by Gartner and Interworks. The announcements typically referred to future releases; the community seems to expect the announced improvements to arrive, roughly speaking, by the time of the next conference, although formally the announcements cover developments over the next few years. There has been some past grumbling about product announcements not followed by actual releases. Regardless, Tableau is a very successful and innovative analytics and visualization platform, a leader in Gartner’s Magic Quadrant, with a large user community that continually suggests improvements and helps each other. For instance, two of the Tableau resources I find myself returning to again and again were not published by the company, but address “string calculations” and “case statements.”
Probably the most intriguing development from a knowledge management perspective was the announcement of natural language querying and machine learning for database discovery. With natural language querying, instead of manually manipulating filters or clicking into visualizations, users will adjust a dashboard’s views in Siri-like fashion by, for instance, asking a precipitation database “where the highest rainfalls in July 2015 were.”
Another announced feature is machine learning that will help identify databases related to the user’s current visualization. This feature will apparently weigh both the popularity of other databases and those databases’ similarity to the current one. Similar suggestion algorithms can be found on any decent shopping website, as well as in Westlaw and Lexis research tools.
Other very impressive, if more technical, developments promise to greatly increase the speed of ingestion and analysis of large data sets (“Hyper Data Engine”) and to aid the extraction, transformation, and loading (“ETL”) of data before it gets into Tableau (“Project Maestro”). These developments show that the company is seeking to extend its capabilities beyond the visualization and analysis of data and into the ETL that is essential for effective use of data within businesses.
The Iron Viz competition is a crowd favorite, not to mention a truly impressive spectacle, at least for those who have tried their hand at creating effective data visualizations.
Three teams, each consisting of a visualization creator paired with a Tableau company “assistant,” receive the same dataset one day before the competition. In front of a live audience who can see every move projected on big screens, the teams start from a blank workbook and create one or a small series of visualizations of the data in just 20 minutes. A panel of judges (Tableau executives and consultants) and the audience (through Twitter) then vote.
This year the dataset consisted of 150 million rows of data about New York City taxi rides. The wrinkle was that contestants were permitted to pull in public external data if they so chose.
Without going into any gory details, the winning visualization was Curtis Harris’ Yellow Taxi, which highlighted the long hours and constant work of New York cab drivers. While all contestants were of course Jedi-level visualization developers – and I do not take lightly their level of learning and years spent developing the skill level required to do what they did – the competition highlights one of the key benefits of Tableau: it is designed to allow users without programming or technical skills to quickly develop visualizations, learn from complex data sets, and effectively communicate their insights. In other words, it is a platform for knowledge acquisition and transmission.
The sessions were generally quite packed. To give some sense of the scale here, each hands-on training room had rows of tables with laptops, at least 150 and often as many as 300 per room. Finding your level within the sessions wasn’t entirely straightforward; I went to a few labeled “advanced” that I didn’t get much from (the levels are Basic, Intermediate, Advanced, and Jedi). I learned this newbie tip from colleagues who had been before: it is very important to register for any hands-on sessions through the mobile app the week before the conference. There were typically lines of people waiting to get in on “stand-by” outside the hands-on sessions.
I went to sessions on calculations, user-centered design, building a Tableau community at Staples, and leveraging data visualizations across a large mid-Atlantic health care company.
No sessions addressed legal risk or legal visualizations specifically. An intriguing session from Grant Thornton, however, suggested that large accounting firms may leverage data analysis to create systems that highlight risky transactions. One example involved the Foreign Corrupt Practices Act: the idea was that large transactions involving the children of government officials, or transactions labeled a “facilitation payment,” should be flagged for review by clients’ risk management teams. This type of service requires extensive access to client data.
Aphorisms and Takeaways
- The joy of discovery is key to science and to data visualization. (Bill Nye)
- To convince others to change you have to go beyond data to speak the others’ language.
- A great viz will, however, inspire conversation.
- Don’t try to cram it all into a single viz; you won’t get sustained engagement that way. (Brian Stephan)
- Data visualization and data analytics are making a difference with climate change, supply chain optimization, emergency response, education, medicine, drug development, and more.
- Applications of data visualization and data analytics are continuing to grow and add value to many organizations, leading to significant investments in data and data analytics people, processes, and technology.