Evolution of Analytics
Introduction to Business Analytics
Business analytics refers to the practice of using data analysis, statistical techniques, and software
tools to improve business decision-making. It aims to enhance operational efficiency, identify market
trends, optimize various business processes, and support strategic decisions through data-driven
insights.
Evolution Timeline
1970s: Decision Support Systems
During the 1970s, Decision Support Systems (DSS) emerged as computerized systems designed to
assist in complex decision-making. These systems comprised three primary components: database
management, model management, and user interfaces.
Database management facilitated the storage and retrieval of relevant data, while model
management involved the application of analytical models to the data. The user interface allowed
users to interact with the system, retrieve information, and analyze data. Technologies of this era
included mainframes and minicomputers, which enabled the processing and storage of data,
alongside query languages like SQL for database interactions. DSS were utilized in areas such as
inventory control, scheduling, and resource allocation.
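As a toy illustration of the database-management and query side of a DSS, the sketch below uses Python's built-in sqlite3 module to run a SQL query of the kind such systems supported; the inventory table, column names, and figures are invented for the example.

```python
import sqlite3

# In-memory database standing in for a DSS data store (table and values are invented).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE inventory (item TEXT, on_hand INTEGER, reorder_point INTEGER)")
conn.executemany(
    "INSERT INTO inventory VALUES (?, ?, ?)",
    [("widget", 12, 20), ("gadget", 75, 50), ("gizmo", 8, 15)],
)

# A typical DSS-style question: which items have fallen below their reorder point?
for item, on_hand, reorder_point in conn.execute(
    "SELECT item, on_hand, reorder_point FROM inventory WHERE on_hand < reorder_point"
):
    print(f"Reorder {item}: {on_hand} on hand, reorder point {reorder_point}")
```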
1980s: Executive Information Systems
The 1980s saw the advent of Executive Information Systems (EIS), which were designed to provide
top executives with easy access to critical data. EIS featured user-friendly interfaces, typically in the
form of dashboards and graphical displays. These systems offered drill-down capabilities, allowing
users to explore data hierarchically, and trend analysis for identifying and visualizing business trends.
Technologies in this period included graphical user interfaces (GUIs) and early forms of data
visualization, which made data analysis more intuitive for executives. EIS were employed in strategic
planning, financial analysis, and competitive intelligence to aid in making informed strategic decisions.
1990s: Business Intelligence
The 1990s brought Business Intelligence (BI), combining data warehousing, online analytical
processing (OLAP), and data mining. Data warehousing involved the centralized storage of large
volumes of data, while OLAP tools
facilitated multidimensional data analysis. Data mining utilized statistical and analytical tools to
extract patterns and knowledge from large datasets. Technologies of this period included ETL (Extract,
Transform, Load) processes for integrating data, relational databases, and data marts. BI was applied
in performance management, market analysis, and customer insights to drive business strategies.
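To make the ETL idea concrete, here is a minimal sketch in Python: rows are extracted from a hypothetical CSV export, transformed (normalized and typed), and loaded into a SQLite table standing in for a data warehouse. The file name and column names are assumptions made for illustration.

```python
import csv
import sqlite3

def etl(csv_path: str, conn: sqlite3.Connection) -> None:
    """Minimal ETL: extract rows from a CSV export, transform them, load into a table."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (region TEXT, amount REAL)")
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):                 # Extract
            region = row["region"].strip().upper()    # Transform: normalize text
            amount = float(row["amount"])             # Transform: cast to a number
            conn.execute("INSERT INTO sales VALUES (?, ?)", (region, amount))  # Load
    conn.commit()

# Demo with an invented export file.
with open("sales_export.csv", "w", newline="") as f:
    f.write("region,amount\nnorth ,1200.50\nsouth,980.00\n")

conn = sqlite3.connect(":memory:")
etl("sales_export.csv", conn)
print(conn.execute("SELECT region, SUM(amount) FROM sales GROUP BY region").fetchall())
```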
2000s: Advanced Analytics
The early 2000s introduced advanced analytics, focusing on deriving actionable insights using
sophisticated statistical techniques. Analytics evolved into three main categories: descriptive,
predictive, and prescriptive analytics.
Descriptive analytics aimed to understand past performance by analyzing historical data to determine
what happened. Predictive analytics used statistical algorithms and machine learning to forecast
future events and trends, addressing questions like what will happen. Prescriptive analytics went a
step further by recommending actions based on predictions, suggesting what should be done to
achieve desired outcomes.
Technologies during this period included data mining algorithms, statistical software such as SAS and
SPSS, and advanced Excel functions. These tools supported applications in marketing campaigns, risk
management, and operational efficiency by providing deeper insights and enabling better decision-
making.
2010s: Big Data
The 2010s were characterized by the emergence of Big Data, involving the management and analysis
of vast and complex datasets. Big Data introduced the concept of the four Vs: volume, velocity,
variety, and veracity.
Volume referred to the large amounts of data generated, velocity to the speed at which data was
produced and processed, variety to the diverse types of data (structured and unstructured), and
veracity to the quality and reliability of the data. Key technologies in this era included the Hadoop
ecosystem (with components like HDFS and MapReduce), NoSQL databases such as MongoDB, and
cloud computing platforms like AWS and Azure, which provided scalable solutions for data storage
and processing.
Big Data applications spanned social media analysis, Internet of Things (IoT), and real-time analytics,
allowing businesses to uncover hidden patterns and insights from massive datasets.
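The MapReduce model mentioned above can be sketched without Hadoop: a map step emits key-value pairs, a shuffle groups them by key, and a reduce step aggregates each group. The canonical word-count example below illustrates the pattern in plain Python; it is not actual Hadoop code.

```python
from collections import defaultdict

documents = ["big data big insights", "data drives decisions"]

# Map: emit (word, 1) for every word in every document.
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle: group the emitted pairs by key (the word).
groups = defaultdict(list)
for word, count in mapped:
    groups[word].append(count)

# Reduce: aggregate each group into a single count per word.
counts = {word: sum(values) for word, values in groups.items()}
print(counts)  # {'big': 2, 'data': 2, 'insights': 1, 'drives': 1, 'decisions': 1}
```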
2020s: AI-Based Analytics
The 2020s have seen a shift towards AI-based analytics, leveraging artificial intelligence to automate
and enhance data-driven decision-making processes. This approach integrates machine learning, deep
learning, natural language processing (NLP), and reinforcement learning to provide more
sophisticated analytics solutions.
Key Analytical Techniques
Data mining focuses on understanding characteristics and patterns among variables in large
databases, using both standard statistical tools and more advanced techniques to extract useful
information from data. Simulation and risk analysis
use spreadsheet models and statistical analysis to examine the impacts of uncertainty on various
estimates and their potential interactions, enabling businesses to explore different scenarios and
assess risks effectively.
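A minimal Monte Carlo sketch of simulation and risk analysis, using only Python's standard library; the demand distribution and the price and cost figures are invented assumptions.

```python
import random
import statistics

random.seed(42)

# Invented assumptions: uncertain demand, fixed unit price and costs.
PRICE, UNIT_COST, FIXED_COST = 25.0, 15.0, 10_000.0

# Simulate profit under uncertain demand many times.
profits = []
for _ in range(10_000):
    demand = max(0.0, random.gauss(mu=1_200, sigma=300))  # uncertain input
    profits.append(demand * (PRICE - UNIT_COST) - FIXED_COST)

print(f"Expected profit: {statistics.mean(profits):,.0f}")
print(f"5th percentile (downside risk): {sorted(profits)[500]:,.0f}")
```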
Spreadsheets and formal models facilitate what-if analysis, allowing manipulation of data to perform
scenario testing. This type of analysis helps businesses understand how specific combinations of
inputs, reflecting key assumptions, will affect model outputs. Additionally, what-if analysis is used to
assess the sensitivity of optimization models to changes in data inputs, providing better insights for
making informed decisions.
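The same what-if idea translates directly into code: a small model function stands in for a spreadsheet formula and is evaluated over named scenarios. The input combinations below are invented for illustration.

```python
def monthly_profit(units_sold: int, price: float, unit_cost: float) -> float:
    """A toy profit model standing in for a spreadsheet formula."""
    return units_sold * (price - unit_cost)

# What-if: test specific combinations of inputs reflecting key assumptions.
scenarios = {
    "baseline":   dict(units_sold=1000, price=20.0, unit_cost=12.0),
    "price cut":  dict(units_sold=1300, price=18.0, unit_cost=12.0),
    "cost spike": dict(units_sold=1000, price=20.0, unit_cost=15.0),
}
for name, inputs in scenarios.items():
    print(f"{name}: profit = {monthly_profit(**inputs):,.0f}")
```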
Types of Analytics
1. Descriptive Analysis
Definition: Descriptive analysis involves examining historical data to understand and describe
past events. It answers the question, “What has happened?” by summarizing data and
identifying patterns or trends. It uses techniques like Statistical Summaries, Data Aggregation,
and Data Visualization; a short code sketch follows the feature list below.
Key Features:
• Data Summarization: Descriptive analysis summarizes large volumes of data into
understandable formats using measures like mean, median, mode, and standard
deviation.
• Trend Identification: It identifies patterns and trends over time, such as sales trends,
customer behavior patterns, or market performance.
• Visualization: Data visualization tools such as charts, graphs, and dashboards are
crucial in descriptive analysis for presenting data insights in a comprehensible
manner.
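A minimal descriptive-analysis sketch with pandas, assuming invented monthly sales figures; it produces the summary statistics and period-over-period trend described above.

```python
import pandas as pd

# Invented monthly sales figures for illustration.
sales = pd.Series([120, 135, 128, 150, 170, 165],
                  index=pd.period_range("2023-01", periods=6, freq="M"),
                  name="sales")

print(sales.describe())     # count, mean, std, min, quartiles, max
print(sales.median())       # robust measure of central tendency
print(sales.pct_change())   # month-over-month trend
```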
2. Predictive Analysis
Definition: Predictive analysis uses statistical models and algorithms to analyze historical data
and make predictions about future events. It answers the question, “What is likely to
happen?” It uses techniques like Regression Analysis and Time Series Analysis; a minimal
regression sketch follows the feature list below.
Key Features:
• Trend Forecasting: Predictive analysis forecasts trends by analyzing past data patterns
to anticipate future outcomes.
• Risk Assessment: It assesses potential risks and opportunities by predicting
probabilities of various outcomes.
• Scenario Planning: Helps in creating and analyzing different future scenarios based on
predictive models.
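A minimal predictive sketch, assuming invented quarterly sales: a least-squares regression line is fitted to the historical series with NumPy and extrapolated one period ahead.

```python
import numpy as np

# Invented historical quarterly sales; fit a linear trend and forecast ahead.
sales = np.array([100.0, 108.0, 115.0, 121.0, 130.0, 138.0])
t = np.arange(len(sales))

slope, intercept = np.polyfit(t, sales, deg=1)    # least-squares regression line
next_quarter = slope * len(sales) + intercept     # "what is likely to happen?"
print(f"Forecast for next quarter: {next_quarter:.1f}")
```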
3. Prescriptive Analysis
Definition: Prescriptive analysis provides recommendations for actions to achieve desired
outcomes. It answers the question, “What should we do?” by suggesting the best course of
action based on data insights and predictions. It uses techniques like Optimization Models,
Simulation, and Heuristic Algorithms; a small optimization sketch follows the feature list below.
Key Features:
• Optimization: Identifies the best course of action under constraints using mathematical
optimization models.
• Actionable Recommendations: Goes beyond forecasts to suggest concrete decisions and
interventions.
• Scenario Evaluation: Uses simulation and heuristic methods to compare alternative
decisions before committing to one.
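A small prescriptive sketch using SciPy's linprog (assuming SciPy is available); the product-mix profits and resource limits are invented. The optimizer recommends the action, here a production plan, that maximizes profit under the constraints.

```python
from scipy.optimize import linprog

# Invented product-mix problem: maximize 20x + 30y profit subject to
# machine hours (x + 2y <= 80) and labor hours (3x + 2y <= 120).
result = linprog(
    c=[-20, -30],                   # linprog minimizes, so negate the profit
    A_ub=[[1, 2], [3, 2]],
    b_ub=[80, 120],
    bounds=[(0, None), (0, None)],  # cannot produce negative quantities
)
x, y = result.x
print(f"Make {x:.1f} units of A and {y:.1f} of B; profit = {-result.fun:.0f}")
```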
Types of Data
Data is commonly classified into four measurement scales: categorical (nominal), ordinal,
interval, and ratio. Two of these are described below.
Ordinal Data
Ordinal data can be ranked or ordered according to some relationship but does not provide
specific numerical differences between ranks.
• Examples:
o Rankings: College sports teams ranked based on performance.
o Survey Scales: Service ratings such as poor, average, good, very good,
excellent.
• Properties: Ordinal data allows for ranking and comparison but lacks fixed units of
measurement, making numerical differences between ranks less meaningful, as the short
sketch below illustrates.
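A short sketch of ordinal behavior using pandas' ordered categoricals; the rating values come from the survey-scale example above.

```python
import pandas as pd

# Survey ratings as an ordered categorical: ranking works, arithmetic does not.
levels = ["poor", "average", "good", "very good", "excellent"]
ratings = pd.Categorical(["good", "poor", "excellent", "average"],
                         categories=levels, ordered=True)

print(ratings.min(), ratings.max())   # poor excellent -- ordering is meaningful
print(ratings > "average")            # elementwise rank comparison
print(ratings.sort_values())          # sorted by declared rank, not alphabetically
```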
Ratio Data
Ratio data is continuous and has a natural zero, making both differences and ratios
meaningful.
• Examples:
o Dollars: Financial figures like sales revenue or expenses.
o Time: Duration of processes or events.
• Properties: Ratio data is the most informative type of measurement, allowing for
meaningful computation of averages, ranges, and ratios, as the example below shows. It
includes all the information of the other data types and can be converted to any of them.
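Finally, a tiny sketch of why ratio data is the most informative scale, using invented revenue figures in dollars.

```python
# Ratio data: a true zero makes both differences and ratios meaningful.
revenues = [50_000.0, 75_000.0, 100_000.0]   # invented dollar figures

mean = sum(revenues) / len(revenues)
value_range = max(revenues) - min(revenues)
ratio = max(revenues) / min(revenues)

print(f"mean={mean:,.0f}  range={value_range:,.0f}  ratio={ratio:.1f}x")
# Saying one revenue is "twice" another is valid here, a claim that would be
# meaningless for ordinal ratings or interval scales such as Celsius temperature.
```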