
The analytics team shouldn't always pursue quick wins

Analytics teams typically look for quick wins to prove their value. But experts say this approach should not become the guiding strategy for data product development.

A common piece of advice says that an analytics team -- particularly a new one that's eager to prove its value -- should aim for quick wins. By delivering a data product or report quickly, the team can get executives and business partners on board and lay the groundwork for future support, even if the first deliverable isn't exactly groundbreaking.

But not everybody agrees with this advice.

"Quick wins are not always long-term wins," said Ahmad Anvari, head of Messenger business and platform analytics at Facebook, in a presentation at the recent Big Data Innovation Summit in Boston. "I've seen this many times in my career."

Think before you report

For Anvari, focusing too much on short-term projects can distract from long-term strategy and cause teams to miss big opportunities. It's something akin to an investor making stock trades every day to maximize daily returns without thinking about retirement planning.

For example, Anvari said when Instagram, which is owned by Facebook, initially launched in-app advertising about two years ago, there was a lot of excitement about the first data to come out. Compared with other features in the app, users were clicking on ads at high rates. But advertisers weren't happy because users typically left the page immediately after clicking on an ad.

Eventually, someone realized the click numbers looked good because, in the main photo stream, ads are the only clickable content type; native posts are not. So, many of the people who clicked on ads did so accidentally. The app now asks users to confirm that they intentionally clicked on an ad before sending them to the advertiser's page, and the data is much more meaningful.

Minimally viable can be minimally valuable

The minimum viable product is another way analytics teams look for quick wins, but it can lead to its own share of problems. Dhruv Bhargava, global head of data science at gaming company Zynga Inc., said in a presentation at the conference that it's important to encourage data scientists and product managers to get usable applications in front of users frequently and to support developers' creativity. But just getting something out the door shouldn't be an end unto itself.

The minimum viable product strategy is often seen as a good thing because it forces developers and analysts to think about how users will interact with a report or application. It also allows them to keep refining the product based on user feedback.

It becomes a problem, though, when developers never update the product after releasing it, which Bhargava said he's seen happen. Product management teams rush to get something basic out the door with the intention of coming back to refine it. But then they don't, leaving users stuck with suboptimal data applications.

"If you say we need to come back to something and improve it, do it," he said.

Don't train for the wrong game

Using the wrong data to build data products can also lead development projects astray. Steve Carter, chief scientist for online dating site eHarmony Inc., said data scientists and members of the analytics team will often use whatever data they can get their hands on at early stages of projects. Typically, this will be some kind of historical data. But Carter said if they are building a product that will be used in production -- in the case of eHarmony, the main product is a predictive engine that matches users based on profile characteristics -- the product should be trained on the same data it will use once in production.

It can be challenging for data scientists to get their hands on production data. This could have to do with governance or engineering restrictions, and it is particularly problematic for a data scientist looking for quick wins.

But Carter said a model trained on historical data will often fail when put in production and used against current data. This is why data scientists should always work in conjunction with data engineers to make sure whatever they are building will hold up. It may take longer, but the end result will be better.
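Carter's point about historical versus production data can be illustrated with a deliberately simple, hypothetical sketch (the feature names and the "model" below are invented for illustration, not drawn from eHarmony's system): a threshold learned from a historical snapshot silently breaks when the live pipeline encodes the same feature differently.

```python
# Hypothetical sketch of training-serving mismatch: a model is "trained" on a
# historical export where age is stored in years, but the production pipeline
# emits age in days. All names and values here are invented for illustration.

# Historical export: age in years, with a label for whether users matched.
historical = [{"age": 25, "matched": 1}, {"age": 40, "matched": 0}]

# Naive "model": a threshold learned from the historical snapshot.
threshold = sum(row["age"] for row in historical) / len(historical)  # 32.5

def predict(row):
    """Predict a match (1) for users below the learned age threshold."""
    return 1 if row["age"] < threshold else 0

# A historical-style row scores as expected...
print(predict({"age": 25}))        # -> 1

# ...but a production event encodes age in days, so every live user
# lands on the same side of the threshold and the model is useless.
print(predict({"age": 25 * 365}))  # -> 0
```

Training on the same data the model will see in production, as Carter advises, surfaces this kind of encoding drift before launch rather than after.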

"When you start envisioning a process of modeling a data product, start with the engineers," Carter said.
