
PROCESS MINING VIRTUAL INTERNSHIP

Internship report submitted in partial fulfillment of the requirements for


the Award of the Degree of

BACHELOR OF TECHNOLOGY
In
Information Technology
by
M Bindu
Regd.No.:22501A1276
Under Supervision of

Dr. G. Reshma
Internship Coordinator
(Duration: Jan-Mar 2024)

DEPARTMENT OF INFORMATION TECHNOLOGY


Prasad V Potluri Siddhartha Institute of Technology
(An Autonomous Institution)
Approved by AICTE, Permanently affiliated to JNTU, Kakinada

DEPARTMENT OF INFORMATION TECHNOLOGY
Prasad V Potluri Siddhartha Institute of Technology

(An Autonomous Institution)

CERTIFICATE

This is to certify that the PROCESS MINING VIRTUAL INTERNSHIP REPORT


submitted by M BINDU (REG NO: 22501A1276) is work done by her and submitted
during the 2024 – 2025 academic year, in partial fulfillment of the requirements for the award
of the degree of BACHELOR OF TECHNOLOGY in Information Technology, for the
VIRTUAL INTERNSHIP BY AICTE EDUSKILLS.

Department Coordinator Head of the Department


Dr. V. Siva Parvathi Dr B V Subbarao

ACKNOWLEDGEMENT

First, I would like to thank the AICTE EDUSKILLS FOUNDATION for giving me the
opportunity to do an internship virtually.

I would also like to thank all the people who worked along with me; with their patience and
openness they created an enjoyable working environment.

It is indeed with a great sense of pleasure and immense sense of gratitude that I
acknowledge the help of these individuals.

I am highly indebted to Principal Dr. K. SIVAJI BABU, for the facilities provided to
accomplish this internship.

I would like to thank my Head of the Department Dr. B.V. SUBBA RAO for his
constructive criticism throughout my internship.

I would like to thank Dr. V. SIVA PARVATHI, College Internship Coordinator, and
Dr. G. RESHMA, Internship Coordinator, Department of IT, for their support and advice in
getting and completing the internship in the above-said organization.

I am extremely grateful to my department staff members and friends who helped me
in the successful completion of this internship.

M BINDU
(22501A1276)

ABSTRACT

Celonis Process Mining is a technology platform that helps organizations visualize, analyze, and
optimize their business processes in real time. It provides insights into how processes like
order-to-cash, purchase-to-pay, and supply chain operations are actually performing, based on data
extracted from enterprise systems like ERP (e.g., SAP, Oracle) or CRM systems (e.g., Salesforce).

Key Components:

 Data Extraction: Celonis pulls raw data (logs, timestamps, events) from an organization’s
transactional systems.
 Process Discovery: It reconstructs actual processes by mapping event logs and showing
how tasks or events flow in the real world, including deviations from intended processes.
 Process Visualization: The platform creates visual representations (process maps) of
workflows, making it easy to see bottlenecks, inefficiencies, or compliance issues.
 Performance Analysis: Celonis provides detailed metrics on process performance, such as
lead times, throughput, rework rates, and compliance with business rules. It can benchmark
process performance and identify opportunities for optimization.
 Actionable Insights: Using AI and machine learning, Celonis suggests improvements or
optimizations for processes. These could be eliminating unnecessary steps, reducing wait
times, or improving resource allocation.
 Automation: Celonis can also help automate parts of the processes to increase efficiency
and reduce human intervention, integrating with Robotic Process Automation (RPA) tools
to drive automation.

Benefits of Celonis Process Mining

 Efficiency Improvements: Identify bottlenecks, inefficiencies, and repetitive tasks in real time.
 Cost Savings: By optimizing processes, businesses can save costs on operations, labor, and
resources.
 Compliance and Audit: Ensure processes are compliant with regulations and standards by
monitoring and auditing them constantly.
 Data-Driven Decision Making: Provides leadership with real-time insights into operations,
enabling better decision-making.

Key parts of the report:

Under each division, we further provide specific industry solutions in focused domains with
cutting-edge technologies.

Benefits of the Company/Institution through our report:

 Identification of Bottlenecks and Inefficiencies: The report will highlight critical points
of delay, rework, or process deviations, providing clear areas where the company can focus
its optimization efforts.
 Process Standardization: The report can reveal variations in the way processes are
executed, allowing the organization to standardize best practices and reduce inconsistencies.
 Real-Time Process Monitoring: The report might propose solutions to enhance real-time
monitoring of key business processes, allowing for quicker detection and resolution of
issues.
 Reduction in Operational Costs: By pinpointing areas where time and resources are
wasted, the report can help identify cost-saving opportunities, such as reducing rework,
cutting down on unnecessary steps, or improving procurement processes.
 Process Automation Opportunities: The analysis might reveal repetitive manual tasks that
could be automated through Robotic Process Automation (RPA), leading to significant time
and cost savings.

TABLE OF CONTENTS
Declaration.................................................................................................................................
Certificate ..................................................................................................................................
Internship completed certificate...............................................................................................
Acknowledgement .....................................................................................................................
Abstract ......................................................................................................................................
Table of contents........................................................................................................................
List of figures..............................................................................................................................
Index............................................................................................................................................
Learning objectives/internship objectives................................................................................
Weekly overview of internship…..............................................................................................
1. Introduction...........................................................................................................................3
1.1 Introduction
1.2 History of company
1.3 Products and Services
1.4 About the course
2. Process Mining………………………………………………………………………………5
2.1 Process Mining Fundamentals
2.2 Process Mining Implementation
2.3 Data Analysis and Preprocessing
2.4 Creation of the Celonis Data Model
2.4.1 Creation of the Activity and case table
2.4.2 Creation of Data Model
2.4.3 PQL introduction and JSON import
2.5 Creation of the Celonis Dashboards
2.5.1 Standby Crew Utilization KPI
2.5.2 Notification Automation Rate KPI
3. Modules ...................................................................................................................................12
3.1 Process Mining
3.2 Process Mining (cloud)
3.3 Process Mining transparency
3.4 Process Mining is the MRI for Processes
3.5 Mining Algorithms
3.6 Starting a project in mining
3.7 Industrial usage of mining
3.8 Process Mining Software
3.9 Software Key Functions
3.10 Process Mining Software Providers
4.Technology……………………………………………………………………………………19
4.1 App templates
4.2 Extracting and loading data
4.3 Editing data transformation
4.4 Customizing process apps
4.5 Root cause analysis
4.6 Managing access control for process apps
5. Applications...............................................................................................................................21
6. Learning outcomes……………………………………………………………………………22
7. Conclusion…………………………………………………………………………………….23
Learning Objectives/Internship Objectives

 Gain a foundational understanding of process mining and its applications

 Learn how to extract, clean, and analyze event log data

 Develop skills in using the Celonis platform for process visualization and analysis

 Understand key performance indicators (KPIs) for process evaluation

 Identify process inefficiencies such as bottlenecks, rework, and deviations

 Propose data-driven process improvements and optimizations

 Explore automation opportunities through process mining insights

 Build proficiency in creating dashboards, visualizations, and reports within Celonis

 Apply process mining techniques to real-world business cases and scenarios

 Develop skills in presenting analytical findings and improvement recommendations

 Enhance knowledge of business process optimization and its impact on organizational efficiency

 Understand the role of process mining in digital transformation and operational excellence

WEEKLY OVERVIEW OF INTERNSHIP ACTIVITIES

1st WEEK
DATE         DAY         NAME OF THE TOPIC/MODULE COMPLETED
07-01-2024   SUNDAY      INTRODUCTION
08-01-2024   MONDAY      HISTORY OF COMPANY
09-01-2024   TUESDAY     ONLINE WEBINAR
10-01-2024   WEDNESDAY   PRODUCTS OF COMPANY
11-01-2024   THURSDAY    SERVICES OF COMPANY
12-01-2024   FRIDAY      ABOUT THE COURSE

2nd WEEK
DATE         DAY         NAME OF THE TOPIC/MODULE COMPLETED
21-01-2024   SUNDAY      PROCESS MINING
22-01-2024   MONDAY      PROCESS MINING (CLOUD)
23-01-2024   TUESDAY     PROCESS MINING TRANSPARENCY
24-01-2024   WEDNESDAY   PROCESS MINING IS THE MRI FOR PROCESSES
25-01-2024   THURSDAY    MINING ALGORITHMS
26-01-2024   FRIDAY      STARTING A PROJECT IN MINING

3rd WEEK
DATE         DAY         NAME OF THE TOPIC/MODULE COMPLETED
04-02-2024   SUNDAY      INDUSTRIAL USAGE OF MINING
05-02-2024   MONDAY      PROCESS MINING SOFTWARE
06-02-2024   TUESDAY     SOFTWARE KEY FUNCTIONS
07-02-2024   WEDNESDAY   PROCESS MINING SOFTWARE PROVIDERS
08-02-2024   THURSDAY    ACADEMIC PROCESS MINING FUNDAMENTALS
09-02-2024   FRIDAY      ACADEMIC PROCESS MINING FUNDAMENTALS

4th WEEK
DATE         DAY         NAME OF THE TOPIC/MODULE COMPLETED
18-02-2024   SUNDAY      APP TEMPLATES
19-02-2024   MONDAY      EXTRACTING AND LOADING DATA
20-02-2024   TUESDAY     EDITING DATA TRANSFORMATION
21-02-2024   WEDNESDAY   EDITING DATA TRANSFORMATION
22-02-2024   THURSDAY    CUSTOMIZING PROCESS APPS
23-02-2024   FRIDAY      CUSTOMIZING PROCESS APPS

5th WEEK
DATE         DAY         NAME OF THE TOPIC/MODULE COMPLETED
10-03-2024   SUNDAY      ROOT CAUSE ANALYSIS
11-03-2024   MONDAY      ROOT CAUSE ANALYSIS
12-03-2024   TUESDAY     MANAGING ACCESS CONTROL FOR PROCESS APPS
13-03-2024   WEDNESDAY   MANAGING ACCESS CONTROL FOR PROCESS APPS
14-03-2024   THURSDAY    APPLICATIONS
15-03-2024   FRIDAY      APPLICATIONS

CHAPTER-1
INTRODUCTION
1.1 Introduction of Celonis
Celonis is a German data processing company that offers software as a service (SaaS) to
improve business processes. It is headquartered in Munich, Germany. Celonis is the global
leader in execution management. The Celonis Execution Management System (EMS)
provides companies with a modern way to run their business processes entirely on data and
intelligence.

1.2 History of the company


● Celonis was founded in 2011 by Alex Rinke, Bastian Nominacher, and Martin Klenk as a
spin-off from the Technical University of Munich.
● In 2012, Celonis joined the SAP Startup Focus program, an accelerator for analytics
startups building new applications on the SAP HANA platform.
● In July 2015, Celonis signed a reseller agreement with SAP. It has since been offered by
SAP as Celonis Process Mining by SAP. It was the first company from the SAP Startup
Focus program to sign a reseller agreement with SAP.
● It has its headquarters in Munich, Germany, and currently has more than 3,000 employees.
● Celonis is the first commercial process mining company.

1.3 Products and Services


Celonis provides Process Mining and Execution Management Software. The Celonis EMS helps
you not just understand your processes, but to run your entire business on data and intelligence. It
provides capabilities for Real-Time Data Ingestion, Process and Task Mining, Planning and
Simulation, Visual and Daily Management, and Action Flows. The EMS tool provides the
following services:
● Real-Time Data Ingestion
● Process and Task Mining
● Planning and Simulation
● Action Flows
● Daily and Visual Management
The company aims to revive businesses by bettering their processes, increasing revenue and
shortening process cycles. It provides a plethora of services to companies in banking,
utilities, consumer goods, healthcare, insurance, the public sector, telecom, retail and many more.
Some notable customers of Celonis are Johnson & Johnson, Dell, iFood, Kraft Heinz, Lufthansa, ABB,
Vodafone and Uber.

1.4 About the course


The track "Process Mining Fundamentals for Students" gives you encompassing insights into both
the theoretical and applied foundations of Process Mining. It covers:
● Introduction to PMF Tracks and Process Mining
● Review and Interpret Analyses: learn the use of the Variant Explorer, Process Explorer, Charts
and Tables, and the Conformance Checker
● Build Analyses Basics: create analysis sheets, configure tables and charts, configure single
KPIs, selections and design components
● Case Study: Pizzeria Mamma Mia
● Identify and Realize Execution Gaps: frame value, sustain value

CHAPTER-2
PROCESS MINING
Types of Process Mining
Process mining typically includes three types of analysis:
 Discovery: Automatically creating process models from event logs without any pre-
existing models. It helps reveal the true nature of processes.
 Conformance Checking: Comparing the actual process (as derived from event logs) with
a pre-defined model to identify deviations or compliance issues.
 Enhancement: Extending or improving existing process models based on the event data
to optimize performance, remove bottlenecks, or improve efficiency.

2.1 Process Mining Fundamentals


Digital transformation is a key driver that businesses focus on today. In order to start the digital
transformation, transparency is a prerequisite, which can be achieved with process mining.
Understanding actual and real processes across various industries and organizational departments
allows for sustainable efficiency improvement, driving transformation and creating value. Process
mining provides a means of generating data-driven transparency on how business processes are
executed within companies. This helps companies to identify bottlenecks in their process chains,
identify non-conforming variations and thus, they can counteract and improve their processes. In
order to make use of process mining, there need to be at least three crucial data properties in
place. The data must comprise a unique identifier (ID), an activity name as well as a timestamp
tied to the respective activity name. This is called the event log. After data-specific
transformations to the event log, the activity table is generated. The activity table contains all
relevant process tasks along with their ID, timestamp and optionally some more specific
information such as whether the activity was system generated or human-made, or the location of
the activity. Each new process mining process requires a mapping of the activity names (the
ellipses seen in a process mining analysis) to the underlying timestamps within the data set.
Mapping is a complex process, sometimes lasting for months, and involves the expert knowledge
of the process owners as well as of the IT architects. Furthermore, it is necessary to come up with a
distinct ID, which can be traced for the entire process.
Process mining is a widely used technique for different problems, varying from purchase process
analytics to supply-chain analytics to software process mining. Process mining will give a
holistic and transparent overview of the entire crew management process with the recorded data
comprising an ID, activity name, and timestamp. The vast amount of data is statistically analyzed
as well as visualized and thus, process mining provides an extensive view on how the crew
management process is enforced. Process mining enables organizations to analyze and evaluate
millions of data points and helps to better understand the process end-to-end. Due to this process
transparency, it is possible to analyze bottlenecks and counteract the identified problems.
Celonis software allows users to intuitively explore the business process, checks for conforming cases,
and has additional functionalities, like a machine learning workbench or an action engine that
proactively suggests process improvements. This work will make use of Celonis process mining
to analyze crew management data and to derive action potentials for process improvements. A
Celonis on-premise 4.6 version was used to build every process mining analysis

2.2 Process Mining Implementation


This chapter elaborates on how the process mining project for the crew management process has
been implemented within Lufthansa CityLine (CLH). All following tasks have been performed
on the CLH SQL Server with SQL statements to create all tables necessary for our analysis. Due
to the setup of the CLH SQL Server, the Celonis on-premise system can connect to the SQL
server. Hence, whenever we create new tables on the CLH SQL server with a specific schema
name, they become available in Celonis as well. Fortunately, the CLH team already had a
database with all relevant raw tables and data. Most of the tables had the crucial data properties of
ID, activity code and timestamp in place, which are needed for process mining. Furthermore, we
also received a rough mapping of the activity codes with their corresponding activity names from
the CLH team. In conclusion, the raw tables include a TLC, a timestamp including begin date and
begin time, and an activity code. The TLC is a pseudonymized employee number. This
information allows us to track a crew member across all their assigned tasks for each duty day
without including any personal information about the crew member. A similar process mining
model has already been validated with a subsidiary of Lufthansa. Thus, it was possible to benefit
from their project output, for example through templates for analyses. However, there were some
differences not only in the database itself but also in the data structure. Specifically, our raw
tables contained different columns and some tables employed in their process mining model were
not available to us.

2.3 Data Analysis and Preprocessing


In order to set up a process mining model it is important to understand the underlying data.
Therefore, we worked with a MySQL database containing crew management data to get detailed
insights. The database of CLH for crew management is called Netline. Within the Netline
database there are 8 different tables with information ranging from hotel bookings to vacation or
sick days of employees, up to the respective flight events. The data we received spans from mid
2018 to now. We performed data cleaning by categorizing columns into larger categories, which
also accounted for human errors made in recording data (e.g. spelling errors). This was done in
accordance with CLH experts to assure that the categories correctly represented the data. We also added
tables not included in our Netline database to make our data more comprehensible (e.g. adding
airport coordinates to map airports). In order to understand the data, SQL queries have been
performed to understand the data structure and data type of the underlying data. As an example, a
part of the content of the raw event table from Netline is illustrated below. This table was one of
our main sources of information. As visible in figure 4, the Netline event table contains many
columns. Of biggest importance to us was TLC, begin date and begin time as well as the column
type. While the TLC allowed us to identify a crew member, the begin date and begin time helped
us to chronologize events. The column types were used to specify the type of event. The types are
very general and contain 8 categories, including Leg, Absence or Simulator. Within CLH a Leg
refers to a completed flight. Also visible from figure 4 are the differences in data types. In the
Netline table data points are stored as integers, varchars and datetime formats.
In order to get the data into the right shape for process mining it was essential to perform extract,
transform and load (ETL) tasks on the data. Not all tables could be used in their raw format,
therefore we had to create a set of new tables based on the raw tables. For this, we extracted the
necessary columns, thereby also reducing the volume of our set of tables. Next, we performed
transformations to standardize the tables to achieve a consistent data set. First, inconsistencies in
the format of dates in the tables would have led to inconsistent Case Keys. Case Keys are the
unique IDs to trace employees across their duty days. We create one case key for every duty day
of every employee. To prevent inconsistencies in our Case Keys, we standardized the date format
with one of the SQL-supported formats: yyyy-mm-dd. After standardizing the date format, we
concatenated it with the time. This allows for a more detailed process flow in the data model,
allowing us to track each employee across a given day, with a chronological order of all events
due to the precise timestamp. The time was stored as a normal integer value such as "245",
representing 02:45 am, or "2309", symbolizing 11:09 pm. The dates and the times were
concatenated in the final datetime format yyyy-mm-dd HH:mm:ss. Furthermore, it was essential
for all tables to have one column in common to be able to join the tables together and infer their
relationship to one another. This was accomplished by adding the Case Key column to most of
the tables. The Case Key was a concatenation of the TLC, which is an integer value, with the
begin date. Thus, the Case Key has the format TLC yyyy-mm-dd. Tables such as the
Cancellations table, which contains cancelled flights, did not need to be joined via the Case Key,
but rather through the flight number. This resulted in a consistent data set across eight tables
containing all important information for every event in the crew management process.
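
The transformations described above can be sketched in a single SQL statement. The raw table and
column names (RAW_EVENTS, TLC, BEGIN_DATE, BEGIN_TIME, TYPE) are illustrative stand-ins for the
Netline columns, and the date functions are MySQL-style, so this is a sketch of the idea rather than
the actual CLH statements.

-- Build the Case Key (TLC plus duty day) and a full datetime for every event.
SELECT
    CONCAT(TLC, ' ', BEGIN_DATE)                           AS CASE_KEY,      -- format: TLC yyyy-mm-dd
    STR_TO_DATE(
        CONCAT(BEGIN_DATE, ' ', LPAD(BEGIN_TIME, 4, '0')), -- 245 -> '0245', 2309 -> '2309'
        '%Y-%m-%d %H%i'
    )                                                      AS EVENT_TIME,    -- yyyy-mm-dd HH:mm:ss
    TYPE                                                   AS ACTIVITY_TYPE  -- e.g. Leg, Absence, Simulator
FROM RAW_EVENTS;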

2.4 Creation of the Celonis Data Model

2.4.1 Creation of the Activity and Case Table
Based on the tables we extracted from the Netline database, we were able to create the cases table
and the activity table; the two most important tables in a process mining analysis. The activity
table contains every activity executed. Our activity table contains over 3.4 million distinct
activities. Based on the mapping provided to us by CLH, the initial activity table contained the
following columns: ID, activity name, timestamp, flight, departure and arrival. A template for the
activity table, which was used in the past for another process mining project in one of CLH's
subsidiaries, contained numerous restrictions, i.e. only specific types of flights, duties or captains
with a specific rank. This served as the basis for our first analysis of the crew management
process. However, after receiving feedback from experts from CLH, especially regarding the
internal codes used at CLH, we wanted to add more details to enhance our data model. It was an
iterative process to add more activities as information was made available to us over time. We
wanted to present the crew management process in as much detail as possible in our process
mining analysis. In the final analysis, many activities from the template were removed and
replaced by more precise ones which better represent the process flow. Therefore, the final
activity table contained additional columns such as longitude and latitude coordinates of all
European airports targeted by CLH and a duty code category, which describes the duty activity in
more detail such as proceeding flight, vacation, sick days etc. This greatly increased the amount
of detailed information we were able to provide in our analyses, as we were able to target specific
events through the duty code categories.
The cases table contains every case we examine in the process mining model. Our cases table
consists of over 1.3 million distinct cases. Every case contains all activities one crew member
completes during one duty day. Thus, the cases table has one entry for each crew member on
every duty day. Therefore, each case can be identified via its distinct Case Key, meaning the
cases table creates a list of all distinct case keys. An inner join of the cases table with the activity
table has been performed for each of the activities to achieve a consistent data set. This means we
keep only the activities related to cases in our cases table. Thereby we created a 1:N relationship
between the cases table and the activities table. The cases table contains further information, such
as the home base of a TLC, the TLC itself as well as cancelled flights. This enabled our team to
build relationships between the cases table and different tables. These relationships were
necessary for a detailed analysis of the crew management process.
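
The cases table and its 1:N relationship to the activity table can be sketched as follows, again with
the illustrative names used above; further case-level columns such as the TLC or the home base would
be joined in from the other Netline tables.

-- One row per distinct Case Key, i.e. one crew member on one duty day.
CREATE TABLE CASES AS
SELECT DISTINCT CASE_KEY
FROM ACTIVITIES;

-- Keep only the activities that belong to a case in the cases table (inner join),
-- which yields the 1:N relationship between CASES and ACTIVITIES.
SELECT a.*
FROM ACTIVITIES AS a
INNER JOIN CASES AS c ON c.CASE_KEY = a.CASE_KEY;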

2.4.2 Creation of the Data Model


Since all relevant tables were created in conformance with the Celonis guidelines, it was possible to
upload the tables directly into Celonis. There, the tables were connected by common data points

to obtain the final data model. The data model with all relationships between tables is depicted below.

Celonis only supports 1:1 or 1:N relationships among the tables. The relationships among the
tables we inferred were verified by the CLH IT experts in a separate meeting. This ensured a
proper setup of the data model, which was imperative to achieve a meaningful analysis. Due to
the connections we established among the tables it was possible to retrieve more detailed
information about the activities, such as the name of a hotel where a crew member stayed or from
which city a crew member took a proceeding flight to start their duty.
This data model, with all its internal relationships, is used within Celonis to visualize the process
flow and to create the process analyses. Hence, the data model is a crucial part of process mining.
During data model creation, we had to manually reload the model in Celonis whenever we
implemented changes in the SQL server. However, once the data model was stable and any
inconsistencies had been removed, we were able to automate data loads and ensure our analyses
were always up-to-date.

2.4.3 PQL introduction and JSON import


After the data model has been successfully loaded into Celonis, the data becomes accessible in
the Celonis front end in the form of analyses. Every analysis has to be set up by data engineers.
The analyses are based on Celonis' own programming language called Process Query Language (PQL).
PQL focuses on process-specific syntax but shows similarities to SQL. An example of a
statement that would count occurrences of a specific process flow from Activity 1 to Activity 2
would look like the following:
CASE WHEN PROCESS EQUALS "Activity 1"
TO ANY TO "Activity 2"
THEN 1.0 ELSE 0.0
END
With PQL, KPIs can be defined and retrieved from the data model. Once a KPI has been
specified, one can make use of the KPI and visualize it on the Celonis front end. Therefore, the
analyses serve as dashboards to see the historic changes of that respective KPI over the recorded
time inside the data model or as a single digit on a daily basis - it depends on the setup by the data
engineer who designs the analysis. Celonis allows for arbitrarily many KPIs and analysis sheets.
Due to the similar work of the subsidiary of Lufthansa, it was possible to transfer the analysis
sheets from the subsidiary as a template and adapt it to our use cases. However, the analysis was
hosted in the cloud version of Celonis, and CLH has an on-premise Celonis version in their entire
IT landscape. Nevertheless, it is possible to transfer analyses across different versions of Celonis.
Hence, we exported a JSON file containing all the analysis descriptions from the cloud system
and imported it into our on-premise version. This also allowed us to replace column
names, column values, and the color scheme inside the JSON file to fit the CLH design and reduce the
amount of necessary changes. From that point on, additional analysis sheets were created with
PQL statements in order to adapt the analysis sheets to the needs of our project.

2.5 Creation of the Celonis Dashboards


2.5.1 Standby Crew Utilization KPI
In order to check the standby crew utilization within the given data, we calculated how many
people were activated given that they were assigned as standby. Hence, we calculated the ratio of
cases where a standby was activated in comparison to the total amount of standby crew as the
following PQL formula clarifies:
SUM(CASE WHEN
PROCESS EQUALS "Publish Plan: Standby" TO ANY TO "Legs" OR
PROCESS EQUALS "Publish Plan: Standby" TO ANY TO "Simulator" OR
PROCESS EQUALS "Standby" AND PROCESS EQUALS "Legs"
THEN 1.0
ELSE 0.0
END) /
SUM(MATCH_ACTIVITIES(NODE["Publish Plan: Standby", "Standby"]))
This formula counts all activated standbys, which had to fly (= ”Legs”) and divides this sum by
the number of published standby plans. For example, a captain is assigned as a standby for a
given date. This means that the captain needs to be available at the respective airport within 3
hours after being notified that he needs to fly because a captain assigned to fly is unavailable.
This would be reflected as one case in our data model where the standby member was activated.
Thus, we aggregate all such activities, where any standby crew member got activated, and divide
it by the total number of standbys. This ratio is then used to monitor the standby crew
utilization across the recorded time frame. Furthermore, it is also possible to conduct a root cause
analysis, to better understand why the activations took place, i.e. sickness of a colleague or long
delay of another flight. This creates transparency and can aid in increasing efficiency and
effectiveness in the long term.
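
For comparison, the same ratio can be approximated with plain SQL on the illustrative activity table
from Chapter 2. This is only a sketch: the table, column and activity names are assumed, and it ignores
the ordering constraint that the PQL operator PROCESS EQUALS ... TO ANY TO enforces, so it is not the
formula used in the actual Celonis analysis.

-- Cases in which a published standby was activated, divided by all standby cases.
SELECT
    SUM(CASE WHEN has_standby = 1 AND has_activation = 1 THEN 1.0 ELSE 0.0 END)
  / SUM(CASE WHEN has_standby = 1 THEN 1.0 ELSE 0.0 END) AS standby_utilization
FROM (
    SELECT
        CASE_KEY,
        MAX(CASE WHEN ACTIVITY_NAME IN ('Publish Plan: Standby', 'Standby') THEN 1 ELSE 0 END) AS has_standby,
        MAX(CASE WHEN ACTIVITY_NAME IN ('Legs', 'Simulator') THEN 1 ELSE 0 END)                AS has_activation
    FROM ACTIVITIES
    GROUP BY CASE_KEY
) AS per_case;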

2.5.2 Notification Automation Rate KPI


This KPI monitors the automation rate for notifications in the crew control sub-process. To
calculate this KPI it is essential to understand how an activity is created in Celonis. Whenever an
activity is system generated, the activity name will have CIT in its respective name. CIT means
crew information terminal and is the CLH system that sends push notifications to employees
affected by schedule changes via their smartphone. Thus, inside the activity table there is a data
point to monitor whether a specific activity was performed manually by an employee (=’User’) or
automatically by the system (=’CIT’). This specific value can be used as an indicator for the
automation rate. This is accomplished by aggregating all automated activities and dividing them by
the total number of activities. Hence, the KPI notification automation rate is calculated by the
following PQL formula.
SUM( CASE WHEN
PU_COUNT("_CEL_CP_CASES", "_CEL_CP_ACTIVITIES"."ACTIVITY_EN",
"_CEL_CP_ACTIVITIES"."ACTIVITY_EN" LIKE ("%notified by CIT%")) > 0
THEN 1.0
ELSE 0.0 END) /
SUM(CASE WHEN
PU_COUNT("_CEL_CP_CASES", "_CEL_CP_ACTIVITIES"."ACTIVITY_EN",
"_CEL_CP_ACTIVITIES"."ACTIVITY_EN" NOT LIKE ("%notified by CIT%")) > 0
THEN 1.0
ELSE 0.0
END)
This KPI enables CLH to get an overview of how automated the crew member notification
process is after changes in the schedule occur. A higher automation rate in the notification
process means less manual labor and less time invested. This can increase productivity in the
crew control department.
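
Following the prose definition above (automated activities divided by the total number of activities),
an SQL counterpart on the illustrative activity table could look like the snippet below. The column
SOURCE_TYPE is an assumed name for the 'CIT' / 'User' indicator described above; the snippet is a
sketch, not the production PQL KPI.

-- Share of activities that were generated automatically by the CIT system.
SELECT
    SUM(CASE WHEN SOURCE_TYPE = 'CIT' THEN 1.0 ELSE 0.0 END) / COUNT(*) AS notification_automation_rate
FROM ACTIVITIES;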

CHAPTER-3
MODULES
3.1 Process Mining
Process mining applies data science to discover, validate and improve workflows. By combining
data mining and process analytics, organizations can mine log data from their information
systems to understand the performance of their processes, revealing bottlenecks and other areas
of improvement. Process mining leverages a data-driven approach to process optimization,
allowing managers to remain objective in their decision-making around resource allocation for
existing processes. Process mining focuses on different perspectives, such as control-flow,
organizational, case, and time. While much of the work around process mining focuses on the
sequence of activities---control-flow---the other perspectives also provide valuable information
for management teams.
In this module we learned about:
 Data transformation
 Data analysis
 Continuous monitoring

3.2 Process Mining (Cloud)


With the Process Mining service in Automation Cloud, you can create new process apps based
on process-specific app templates. An app template contains a predefined set of dashboards and
KPIs for process analysis and can be used as the starting point for creating your process apps. If
available, an app template can include a built-in connector for a specific combination of a process
and source system. It offers out-of-the-box app templates for several processes and source
systems that you can use as the starting point for creating your process apps. You can customize
these app templates to your business needs and publish them with a set of dashboards and KPIs to
enable business users to monitor and analyze the processes in detail. When creating a process
app, you can upload data from .csv or .tsv files, or you can set up a connection to a source system
using the extraction tools CData Sync or Theobald Xtract Universal. You can also use
DataBridgeAgent with custom .mvp connectors to upload data from your source system.
In this module we learned about:
 App templates
 Extracting and loading data

3.3 Process Mining transparency
Process mining is a process management technique. It aims to discover, monitor and improve
process flows by extracting readily available knowledge from information systems event logs.
Process mining provides companies with complete visibility into how processes really work.
With these insights, companies can then identify opportunities for process optimization. Process
mining involves several steps.
 The automated process discovery: extraction of process models from an event log.
 The conformity check: monitoring deviations by comparing the model and the log.
In this module we learned about:
 Automation Process Discovery
 Conformity Check
 Organization Mining

3.4 Process Mining is the MRI for processes


Process mining technology could also be compared to magnetic resonance imaging (MRI)
technology, which collects information from the body's cells to create an image. Doctors then use
this MRI image to diagnose health conditions. Process mining works on a similar principle, only in a
business environment: it collects data from the smallest parts of process activities and
assembles it into a picture that companies can use to diagnose the state of their workflows.
Process mining is changing the way companies operate and manage their business operations. In
their quest for process quality, companies can use process mining to really get to know their
process, evaluate it against the ideal process model and optimize it as needed.
In this module we learned about:
 MRI Technology
 Risk of Confusion

3.5 Mining Algorithms


The mining algorithm determines how process models are created. The best-known categories
are:
 Deterministic algorithms: Determinism means that an algorithm produces only defined
and reproducible results. It always delivers the same result for the same input. The
deterministic algorithm was one of the first algorithms capable of handling concurrency.
It takes an event log as input and computes the order relation of the events contained in
the log (see the sketch at the end of this section).

 Heuristic Algorithms: Heuristic mining also uses deterministic algorithms. However,


they refer to the frequency of events and traces to reconstruct a process model. A common
problem in process mining is that real-world processes are very complex and their
discovery leads to complex models. This complexity can be reduced by neglecting rare
paths in the models.
 Genetic Algorithms: They use an evolutionary approach that mimics the process of
natural evolution. They are not deterministic. Genetic mining algorithms follow four
steps: Initialization, Selection, Reproduction, and Termination.
In this module we learned about:
 Deterministic Algorithm
 Heuristic Algorithm
 Genetic Algorithm
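
As an illustration of the order relation that the deterministic algorithm above starts from, the
directly-follows pairs of an event log can be derived with a generic SQL query over the illustrative
activity table used in Chapter 2. This is only a sketch of the input such an algorithm consumes, not a
full discovery algorithm; the resulting pair frequencies are also what heuristic miners use to neglect
rare paths.

-- Directly-follows relation: for every case, pair each activity with the one that
-- immediately follows it, then count how often each pair occurs in the log.
SELECT FROM_ACTIVITY, TO_ACTIVITY, COUNT(*) AS FREQUENCY
FROM (
    SELECT
        ACTIVITY_NAME AS FROM_ACTIVITY,
        LEAD(ACTIVITY_NAME) OVER (PARTITION BY CASE_KEY ORDER BY EVENT_TIME) AS TO_ACTIVITY
    FROM ACTIVITIES
) AS pairs
WHERE TO_ACTIVITY IS NOT NULL
GROUP BY FROM_ACTIVITY, TO_ACTIVITY
ORDER BY FREQUENCY DESC;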

3.6 Starting a Project in mining


To start a project in process mining, one needs to follow some basic requirements, which are
classified as follows.
 Determine Problem: Identify the problem of importance to the business that can
realistically be addressed with process mining.
 Identify the Data: Identify the data sources that need to be fully understood to address
the business process issues under consideration
 Setting up a pilot project: Set up a pilot project to prove the potential value of a process
mining solution.
 Accept Truth: Accepting the results of the analysis, as process mining provides, among
other things, a clear picture based on facts.
In this module we learned about:
 Determining problem
 Defining the Data
 Pilot Project
 Accepting Truth

3.7 Industrial Usage of Mining


 Production: In the manufacturing industry, timely and accurate delivery to a customer is
the goal. When a company has multiple factories in different regions, there are usually
differences between the reliability of deliveries. It is fairly easy to see that they exist, but
it is more difficult to understand exactly where or why they are happening. Process
mining can be used to compare the performance of different locations, down to individual
process steps, including duration, cost, and the person performing the step. All event data
available in the systems is suitable for use. In this way, facts can be generated.
 Banking and finance: In the financial sector, it is important to comply with rules and
regulations and to be able to provide evidence of this. By using the event data from the
systems, individual cases can also be visualized as a process flow.
In this module we learned about:
 Production
 Financing
 Telecom

3.8 Process Mining Software


A process mining solution should have strong detection capabilities. It should be able to search
event logs to track what employees are actually doing and then create an appropriate process
model by generating process maps of the entire business flow. In addition, the solution should
have robust conformance checking that analyzes event logs to ensure that actions match process
models. Third, a process mining solution needs performance analysis and improvement
capabilities that analyze potential inefficiencies within an event log to determine if and how they
can be improved, and then make improvements based on real process data. Ultimately, though,
which software is right for the job depends on the size of the company, its business
needs, and its goals.
In this module we learned about:
 Process Detection
 Conformity Testing
 Performance Analysis

3.9 Software Key Functions


If your selected process mining software fulfills these key functions, then you have already made
a good choice. However, you should always keep in mind that your company's ability to measure,
monitor and optimize business processes has a direct impact on revenue and customer
satisfaction. Therefore, it is important to choose the right process mining solution wisely to
ensure that all business goals are optimally met. If necessary, an expert can also be consulted.

The key functions are:
 Identify bottlenecks and process optimization opportunities
 Provide insights into failed process steps
 Ensure an end-to-end view of the entire process
 Monitor performance indicators in real time
 Perform data cleansing
 Compliance analysis and gap analysis
 Provide continuous business process monitoring in real time
 Improve the process model
In this module we learned about:
 Identifications
 Optimization

3.10 Process Mining Software Providers


The leading process mining software providers in the market include Celonis, UiPath, Disco (Fluxicon), QPR ProcessAnalyzer, and Apromore (see Chapter 4).

In this module we learned about:


 Different Software Providers

Key modules of the Process Mining:


1. Process Discovery
 Purpose: Automatically create a visual representation of how business processes actually
run based on event log data.
 Functionality: Analyzes raw data to generate process models, uncovering the real
workflows, variations, and sequences of activities within an organization.
 Key Use: Understanding the current state of a process without needing predefined
models.
2. Conformance Checking
 Purpose: Compare the actual execution of processes with predefined models or standards
to identify deviations and ensure compliance.
 Functionality: Highlights mismatches, process variations, and non-compliance issues,
helping organizations detect where processes deviate from the expected flow.
 Key Use: Ensuring that processes follow regulatory requirements and business rules.
3. Performance Analysis
 Purpose: Measure and analyze the performance of processes based on key metrics like
time, cost, and frequency of activities.
 Functionality: Tracks KPIs, process cycle times, and identifies bottlenecks,
inefficiencies, and delays to improve overall process performance.
 Key Use: Monitoring process efficiency and making data-driven decisions for process
optimization.

4. Enhancement (Process Improvement)


 Purpose: Extend and improve existing process models based on insights from data to
make the processes more efficient.
 Functionality: Identifies opportunities for streamlining, eliminating inefficiencies, or
optimizing resource allocation, enabling continuous process improvement.
 Key Use: Proactively improving workflows based on real-time data analysis.
5. Predictive Analysis
 Purpose: Use historical data to forecast future process behaviors, delays, and potential
outcomes.
 Functionality: Applies machine learning algorithms to event logs to predict issues,
process delays, or other risks before they happen.
 Key Use: Predicting and preventing future inefficiencies or failures in the process.
6. Robotic Process Automation (RPA) Identification
 Purpose: Identify automation opportunities by analyzing repetitive tasks and manual
activities within processes.
 Functionality: Maps out tasks that are rule-based and repetitive, highlighting where
Robotic Process Automation (RPA) can be applied for efficiency gains.
 Key Use: Supporting automation initiatives by discovering processes suitable for RPA
implementation.
7. Root Cause Analysis
 Purpose: Diagnose the underlying reasons for inefficiencies, bottlenecks, or deviations in
the process.
 Functionality: Explores relationships between process steps, allowing organizations to
understand why certain problems are occurring.
 Key Use: Identifying and addressing the root causes of process inefficiencies to make
targeted improvements.
8. Simulation
 Purpose: Simulate different process scenarios to test the impact of potential changes
before implementing them in real-time.

 Functionality: Provides a virtual environment to model various changes in the process
and evaluate their impact on performance metrics.
 Key Use: Experimenting with changes to improve processes without the risk of disrupting
current operations.

9. Process Benchmarking
 Purpose: Compare the performance of specific processes against industry standards or
internal benchmarks.
 Functionality: Assesses process performance in relation to competitors or internal goals,
helping identify areas where the organization is underperforming.
 Key Use: Driving continuous improvement by measuring processes against best practices
or competitors' performance.
10. Process Visualization
 Purpose: Create interactive, easy-to-understand visual representations of process flows.
 Functionality: Generates process maps, diagrams, and dashboards to help users
intuitively explore and understand their business processes.
 Key Use: Facilitating communication and understanding of complex processes among
stakeholders.

CHAPTER-4
TECHNOLOGY
Process Mining Technologies
Process mining applies data science to discover, validate and improve workflows. By combining
data mining and process analytics, organizations can mine log data from their information
systems to understand the performance of their processes, revealing bottlenecks and other
areas of improvement. Process mining technologies are tools and techniques used to analyze,
monitor, and improve business processes by extracting knowledge from event logs available in a
system. These technologies bridge the gap between data science and business process
management (BPM) by allowing organizations to use data from their enterprise systems (such as
ERP, CRM, and workflow systems) to visualize how processes actually run.

4.1 App templates


With the Process Mining service in Automation Cloud, you can create new process apps based on
process-specific app templates. An app template contains a predefined set of dashboards and
KPIs for process analysis and can be used as the starting point for creating your process apps. If
available, an app template can include a built-in connector for a specific combination of a process
and source system.

4.2 Extracting and loading data


When creating a process app, you can upload data from .csv or .tsv files, or you can set up a
connection to a source system using the extraction tools CData Sync or Theobald Xtract
Universal. You can also use DataBridgeAgent with custom .mvp connectors to upload data from
your source system.

4.3 Editing data transformations


Transformations are applied to the data stored in the database to make sure the data adheres to a
data schema which can be loaded in the Process Mining process app. In Process Mining, you can
customize the transformations to adapt them to your data schema.
4.4 Customizing process apps
 Dashboard editor
After creating a process app from an app template, you can edit the dashboards to
customize the process app to your business needs. The Dashboard editor provides various
options to create different views, and to organize, group, and filter data.
 Data Manager
The Data Manager enables you to customize the data used in your process app. With the Data
Manager you can edit data fields and metrics to change the display names used in your
app. In addition, you can toggle fields to be visible or not.

4.5 Root cause analysis


With Root cause analysis, you can compare the influence of case properties on a certain behavior
to find significant data influencers for specific process situations. A set of cases is defined based
on the period filter. This selection is called Reference cases. Within this set of cases, you can
select the behavior that you want to analyze.

4.6 Managing access control for process apps


The Admin Console module enables you to manage access by assigning roles to users or groups.
The permissions model allows you to integrate all your employees using Process Mining based
on your business requirements.

Process Mining Technologies and Tools


 Celonis: One of the leading process mining platforms, Celonis provides process
visualization, real-time monitoring, and recommendations for process optimization.
 Disco (Fluxicon): A process mining tool that focuses on process discovery, analysis, and
performance tracking with an easy-to-use interface.
 UiPath Process Mining: Integrated into the UiPath automation platform, it offers
capabilities to discover, monitor, and improve automation and processes.
 QPR ProcessAnalyzer: A solution for advanced process mining, focusing on
conformance checking, root cause analysis, and performance measurement.

 Apromore: An open-source process mining tool providing features for process discovery,
conformance checking, and process enhancement.

CHAPTER-5
APPLICATIONS
Process mining applies data science to discover, validate and improve workflows. By combining
data mining and process analytics, organizations can mine log data from their information
systems to understand the performance of their processes, revealing bottlenecks and other areas
of improvement. Process mining is beneficial for many situations in large organizations. Areas
where process mining can be actively applied include the following:
 Automation for RPA Success: Understanding actual processes, variations, and
opportunities is essential to ensure success in Robotic Process Automation (RPA)
projects.
 Comprehensive Process Reporting: Providing complete process KPIs and dashboards to
monitor and analyze the performance of a given process.
 Digital Transformation: Gaining insights into the "big picture" to understand how
businesses operate, prioritize key areas, and implement effective digital transformations.
 Scaling Optimization Across Operations: Expanding optimization efforts across
multiple business units and locations, supporting process control through the analysis of
process data.
 Effortless Process Capture: Capturing processes across the enterprise with minimal
human effort, ensuring comprehensive process visibility.
 Bottleneck and Inefficiency Identification: Detecting bottlenecks, deviations, and
inefficient processes, which can then be improved or automated.
 Continuous Improvement Monitoring: Continuously monitoring and measuring the
impact of process improvements to ensure sustained efficiency.
 Simplified Compliance and Auditing: Ensuring compliance by maintaining complete
audit trails, simplifying the auditing process.
 End-to-End Process Perspective: Delivering a full context and end-to-end view of
processes to support effective process improvements.
 Automation Opportunity Identification: Identifying the most valuable and effective
processes suitable for automation, ensuring optimal returns from automation investments.
 Business Process Discovery: Automatically generating real-time process models from
event logs to visualize how processes operate.
 Process Optimization: Identifying inefficiencies, bottlenecks, and redundant steps to
streamline workflows and improve performance.
 Compliance Monitoring: Ensuring business processes adhere to regulatory standards and
company policies by comparing actual performance with predefined models.

 Auditing and Risk Management: Enhancing internal audits by identifying deviations,


risks, and non-compliance areas.
 Performance Tracking: Monitoring KPIs (Key Performance Indicators) and analyzing
process performance over time to make data-driven decisions.
 Customer Journey Analysis: Understanding customer interactions and behaviors across
various touchpoints to improve customer experience and satisfaction.
 Predictive Process Analytics: Using historical data to predict future process outcomes,
detect potential issues, and prevent delays or failures.
 Process Automation Insights: Identifying opportunities for Robotic Process Automation
(RPA) by pinpointing repetitive, rule-based tasks suitable for automation.
 Supply Chain Management: Analyzing supply chain workflows to optimize logistics,
reduce lead times, and improve inventory management.
 IT Operations Optimization: Monitoring IT service management processes to ensure
timely incident resolution, reduce downtime, and improve overall service quality.
 Healthcare Process Improvement: Streamlining patient care workflows, reducing
waiting times, and improving hospital resource allocation.
 Fraud Detection: Identifying unusual patterns in financial transactions or loan processes
that may indicate fraudulent activity.
 Cost Reduction: Highlighting areas of resource overuse or inefficiencies that lead to
increased operational costs, enabling cost-cutting strategies.
 Employee Productivity Monitoring: Analyzing employee tasks and workflows to
improve productivity and allocate resources more effectively.

Learning outcomes
 Gain an overall understanding of basic Process Mining concepts.
 Become familiar with process mining core services and tools
 Learn the architectural principles of process mining.
 Understand and be able to explain Process Mining and compliance measures.
 Understand the Process Mining budget and pricing philosophy.
 Engage in hands-on practice to hone key skills, and learn the knowledge and skills required
to complete the Process Mining Virtual Internship certification.

Conclusion
The Celonis Process Mining Virtual Internship has been an invaluable experience, providing
hands-on exposure to process mining, data analysis, and business process optimization.
Throughout the internship, I gained a deep understanding of how Celonis' powerful tools can be
used to identify inefficiencies and optimize business workflows. I learned to work with process
data, apply analytical techniques, and use Celonis' Action Engine and dashboards to recommend
data-driven improvements.
This experience honed my skills in data analytics, problem-solving, and critical thinking, as I
worked on real-life business cases, applying theoretical concepts to practical scenarios. I also
developed my ability to communicate insights clearly to stakeholders, which is essential for
driving meaningful change in business operations.
By doing this internship I learnt
 The importance of Process Mining.
 Tools that help us to optimize our service costs.
 Software Production and Estimation.
 Processing huge volumes of data.
 And other services that are provided in process mining.
Overall, this internship has equipped me with valuable skills and knowledge, fostering a strong
foundation in process mining and preparing me for a career in data analysis, process
improvement, and business intelligence. The experience of working with cutting-edge technology
in a virtual setting has also enhanced my adaptability and self-management skills, which will be
instrumental in my future endeavors.
