Why You Need an Algorithm - Not a Data Scientist
Mark Twain once said: “The past does not repeat itself, but it rhymes.” Although future events have unique circumstances, they usually follow familiar past patterns. Today, data scientists can predict everything from disease outbreaks to mortality to riots.
It should come as no surprise, then, that companies trying to hear the rhymes and see the patterns in their sales conversions are attempting to manually analyze their own data, enlist the best data scientists, and train their managers to be more quantitative.
This people-centric, high-touch approach, however, is not scalable. Markets are too dynamic, and many of the shifts too subtle, for humans to track reliably.
Consider a company that sells electronic devices. Historically, it has sold successfully to customers who value its quick delivery and product quality. Over time, competition grows and a global preference for green products emerges. The profile of the company's ideal customer slowly shifts, a change that could easily go unnoticed by anyone examining the market manually. Such small shifts, however, are identifiable by algorithms that continuously monitor the company's historical sales cycle, cross-reference it with external sources such as social media posts and newspaper articles discussing these trends, and find correlations with the propensity to buy. Given the size of this information base and its unstructured nature, monitoring all these subtle changes in real time is a nearly impossible task for a human analyst, and even if it were possible, the time and other resources consumed would be enormous.
While few companies have the luxury of data scientists with the expertise needed to develop these sophisticated algorithms, and even fewer have staff qualified to analyze the results effectively, the need for such experts is shrinking. More and more automated tools are being developed that can analyze thousands of events, and the most sophisticated require little or no human intervention, no integration time, and almost no servicing to re-tune the predictive model as dynamics change.
Today, automated algorithms can identify patterns and provide insights such as:
Did you notice a big portion of your customer churn is from companies that have not used one specific feature of your product in the last three months?
Did you notice that the leads that converted to closed deals this month were from medium-size, high-growth companies that were searching for keywords comparing your product to your competitors' products?
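The first insight above, churn concentrated among customers who stopped using a particular feature, can be checked with even a very simple analysis. The sketch below is purely illustrative; the account records and field names are hypothetical, and a real system would run this continuously over live usage data rather than a static list:

```python
# Hypothetical account records: did the account churn, and did it
# use feature X in the last three months? (Illustrative data only.)
accounts = [
    {"churned": True,  "used_feature_x": False},
    {"churned": True,  "used_feature_x": False},
    {"churned": True,  "used_feature_x": True},
    {"churned": False, "used_feature_x": True},
    {"churned": False, "used_feature_x": True},
    {"churned": False, "used_feature_x": False},
]

def nonusage_rate(records):
    """Fraction of records that did NOT use feature X."""
    if not records:
        return 0.0
    return sum(not r["used_feature_x"] for r in records) / len(records)

churned = [r for r in accounts if r["churned"]]
retained = [r for r in accounts if not r["churned"]]

# A much higher non-usage rate among churned accounts than retained
# ones suggests feature X is a churn signal worth investigating.
print(f"non-usage among churned accounts:  {nonusage_rate(churned):.0%}")
print(f"non-usage among retained accounts: {nonusage_rate(retained):.0%}")
```

An automated system would repeat this comparison across every feature and time window, which is exactly the kind of exhaustive, continuous monitoring that is impractical for a human analyst.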
But as your business changes, the answers will change as well, requiring ever more automation to track those changes and supply business leaders with real-time, actionable recommendations that stay relevant and can be costly to ignore.
In the next few years, I believe many businesses, especially in B2B, will use prediction. But those who get the most from these analytics will be the ones using automated algorithms, which are faster, more accurate, more scalable, and more adaptive than manual analysis.
Stock trading was once done by human analysts; today, automated machine-learning algorithms increasingly inform their decisions, and it has become much harder to compete without them. Similarly, within a few years, very few businesses will be able to afford not having automated decision-making systems mining their data and suggesting the best next actions, not only in operations but in marketing, sales, and customer success as well. The ability to follow large amounts of ever-changing information will be the competitive edge.
Kira Radinsky, Ph.D., is the chief scientist and director of data science at eBay. She co-founded SalesPredict (acquired by eBay in 2016) and serves as a visiting professor at the Technion, Israel's leading science and technology institute.