Legacy analytics tools are fighting for survival

The first recorded use of business analytics dates to the 19th century, when the famed US mechanical engineer Frederick Winslow Taylor set up a time management system. In the 1960s, businesses started relying on computers for decision-making systems, and the popularity of analytics grew. The arrival in 1972 of SAS – a software suite to mine, alter, manage and retrieve data from multiple sources and perform statistical analysis – changed the analytics game.

Analytics tools are a critical component of businesses globally. They are used to retrieve, sort and process data and present it in a simple format that enables decision-making.

In the 1990s, businesses had multiple analytics tools at their disposal for information management, performance management and visualization. Today, more and more businesses are migrating to the cloud for cost reasons, ease of data management and governance, and greater control and security. Also, companies such as DataRobot, Databricks and Dataiku are leveraging machine learning and open-source technologies and libraries to help businesses make the best use of their data.

Cloud is the future

Before cloud computing, businesses had a hard time maintaining servers and building infrastructure to host applications, not to mention the high cost involved. According to a Gartner report, modernizing legacy applications could reduce IT costs by nearly 74 percent.

In 2020, SAS revamped its primary analytics and BI platform to make it cloud-native. However, the company continues to offer its SAS 9.4 to serve its on-premises customers. SAS’s decision was influenced by its competitors such as MicroStrategy, Qlik and ThoughtSpot adopting a cloud-first strategy.

According to Jay Upchurch, SAS’s executive vice president and CIO, the company reached an inflection point where it had to decide whether to continue on the path it was on or pause and rewrite.

In fact, the launch of Apache Spark in 2014 rocked SAS’s boat. Apache Spark is a unified analytics engine for large-scale data processing. It provides high-level APIs in Scala, Java, Python and R, and an optimized engine that supports general computation graphs for data analysis. SAS, by contrast, lacks big data ETL (extract, transform, load) capabilities.

Spark also supports a rich set of higher-level tools, including Spark SQL for SQL and DataFrames, the pandas API on Spark for pandas workloads, MLlib for machine learning, GraphX for graph processing, and Structured Streaming for stream processing. The creators of Apache Spark went on to found Databricks.
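The ETL pattern referenced above can be sketched in miniature with plain Python – a toy stand-in for what Spark’s engine does across a cluster. The sales data, column names and table name here are invented purely for illustration:

```python
import csv
import io
import sqlite3

# Invented sample data standing in for a raw source file.
RAW_CSV = """region,amount
north,100
south,250
north,75
"""

# Extract: read rows from the raw CSV source.
rows = list(csv.DictReader(io.StringIO(RAW_CSV)))

# Transform: aggregate sales totals per region.
totals = {}
for row in rows:
    totals[row["region"]] = totals.get(row["region"], 0) + int(row["amount"])

# Load: write the aggregated results into a SQLite table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales_by_region (region TEXT, total INTEGER)")
conn.executemany("INSERT INTO sales_by_region VALUES (?, ?)", totals.items())

result = dict(conn.execute("SELECT region, total FROM sales_by_region"))
print(result)  # {'north': 175, 'south': 250}
```

In Spark the same three stages run in parallel over distributed data, with the DataFrame API or Spark SQL replacing the hand-written loop.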

Last week, data science company RapidMiner launched a next-gen cloud platform for enterprises. RapidMiner builds a software platform for data science teams that unites data preparation, machine learning and predictive model deployment. The platform has been re-architected from the ground up to deploy models faster and bridge the gap between data science and business understanding, and it scales across the enterprise by connecting to common data sources and offering code-free deployment.

Tableau Software, an American interactive data visualization software company, launched ‘Tableau Online’ in 2014. It soon became the company’s best-selling product. KNIME, a free and open-source data analytics, reporting and integration platform, launched the ‘KNIME Cloud Analytics Platform’ in 2016.

Issues with legacy analytics tools

Legacy systems are rigid and don’t lend themselves to easy integration, and that lack of flexibility can hamper the day-to-day operations of a business. Contrast that with a tool like Microsoft Power BI, which operates seamlessly with Excel and text files, SQL Server and other cloud-based tools. Cloud-native tools also fare far better than legacy tools when it comes to security.

Legacy tools were developed at a time when big data wasn’t a thing. They were not designed to deal with huge swathes of data and tend to slow down in the face of complex, unstructured data. Modern analytics tools, on the other hand, are robust and can extract actionable insights from vast amounts of raw data.

Further, platforms such as Databricks are fast replacing legacy tools. Databricks allows users to collaboratively create and run data analysis projects in a structured, scheduled environment with the help of open-source technologies.

Databricks also eliminates data silos and allows different teams to work together on the same platform, thus standardizing the process and making it easy for each person to understand the different processes involved. Since it controls the entire workflow, it leads to better communication, speed and efficiency.

With technology evolving at an ever-faster pace, the amount of data generated will only grow in the coming years. In that scenario, a company is more likely to choose Apache Spark over SAS.

Legacy systems aren’t obsolete. Yet!

Though most legacy systems are on their way out, some businesses still rely on them, and some have done so for decades.

Microsoft Excel is a prime example of a legacy tool still going strong, and Microsoft has since made Excel available on the cloud. In 2019, 54 percent of businesses in the US were still using Excel.

SAS, despite its limitations, was still selling its analytics software and services to customers in 145 countries in 2020, and 90 percent of Fortune 500 companies still use SAS.

Moreover, switching to a new system is not easy. Take, for example, the cost of bringing employees up to speed: everyone in the organization, from executives to managers to the IT team, will have to spend time learning the new tools.

Also, legacy tools hold years’ worth of data for many businesses, which makes migrating to the cloud no small task. But even though modernization is costly and time-consuming, businesses can’t afford to miss the boat.
