AUSTIN -- If there's one weakness users see in Tableau's data visualization tool, it's that it has been slow to incorporate machine learning and other advanced analytics capabilities that are reshaping the analytics tools market.
The company may be taking steps to remedy this impression.
At Tableau Conference 2016 here, Tableau executives discussed a number of changes to the software due next year that leverage machine learning both to make the software easier to use and to deepen its functionality.
First, Tableau is planning to add a natural language processing feature that will allow users to explore their data using free text. For example, a user looking at a map with data points plotted geographically can type a query into a search box to show only the points in a certain neighborhood. The NLP engine interprets the user's query automatically, without requiring them to formally call out specific data elements.
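To make the idea concrete, here is a toy sketch of free-text filtering -- not Tableau's actual engine, just an illustration of matching query words against field values instead of requiring formal field references (the data and function are invented for the example):

```python
# Hypothetical plotted points; in practice these would come from a data source.
points = [
    {"name": "Cafe A", "neighborhood": "Greenwood"},
    {"name": "Bar B", "neighborhood": "Ballard"},
    {"name": "Shop C", "neighborhood": "Greenwood"},
]

def filter_by_query(points, query):
    """Keep points whose field values appear as words in the free-text query."""
    words = query.lower().split()
    return [p for p in points
            if any(str(v).lower() in words for v in p.values())]

# A query like "show points in greenwood" matches the two Greenwood points
# without the user ever naming the "neighborhood" field.
print(filter_by_query(points, "show points in greenwood"))
```

A production engine would also handle synonyms, spelling variants and ambiguous matches, but the core step -- mapping free text onto data elements the user never named -- is the same.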
Second, the Tableau data visualization software will add a feature similar to a recommendation engine to help users build visualizations automatically. Users will be able to highlight segments of data, and the engine will recommend charts that call out potentially significant elements, such as characteristics the data points share or diverge on.
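One simple way to think about that recommendation step -- again a hypothetical sketch, not Tableau's implementation -- is to split the fields of a highlighted segment into those the points share and those they diverge on, since the diverging fields are the natural candidates to call out in a chart:

```python
def recommend_callouts(points, fields):
    """Classify each field of a highlighted segment as shared or diverging."""
    shared, diverging = [], []
    for f in fields:
        values = {p[f] for p in points}  # distinct values within the segment
        (shared if len(values) == 1 else diverging).append(f)
    return {"shared": shared, "diverging": diverging}

# Two highlighted points share a region but differ in category, so a
# recommender might suggest a chart broken out by category.
segment = [
    {"region": "West", "category": "Tech"},
    {"region": "West", "category": "Food"},
]
print(recommend_callouts(segment, ["region", "category"]))
```

A real engine would go further -- ranking fields by statistical significance and mapping them to chart types -- but the shared-versus-diverging split captures the basic intuition.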
In an interview before the conference, David Menninger, an analyst at Ventana Research, said this kind of functionality is in high demand among businesses, and adopting it will be central to the ability of visualization vendors like Tableau to stay relevant in today's analytics market.
He said that as more businesses master basic BI, the type of analytics that delivers the most benefit is going to become more complicated. This will involve identifying potentially subtle correlations in large data sets that can't necessarily be spotted visually.
"If all you're doing is visualization, you have to have pre-knowledge of your data to spot patterns," he said. "The only way you can effectively work with big data is more advanced analytics techniques."
Tableau to incorporate new data engine
Tableau executives also provided a peek under the hood at a new data engine the company is working to deliver next year. This past May, the company acquired HyPer, an in-memory database company. At the time, executives said they would add staff and technology from HyPer as part of a research and development center, but now that technology is coming to the primary data visualization product.
Executives said putting the in-memory database engine at the center of the software will speed up data ingestion and bring iterative development of visualizations on live data closer to real time.
Jason Flittner, a senior analytics engineer on the content data engineering and analytics team at Netflix, said he thinks the new data engine will enable him and his team to pull more data into Tableau than is currently possible. This will make it possible to do more interactive, visual data analysis and could eliminate some of the need for coding in more data science-heavy tools, Flittner added.
"I'm excited to see what we can do with it," he said. "I'm looking forward to getting our hands on it."
The company is planning to roll out all the new features announced at the conference into the Tableau data visualization tool over the course of 2017.