
Advanced Analytics Platform Technical Guide

Information in this document is subject to change without notice.


No part of this document may be reproduced or transmitted in any form or by any means, for any purpose, without the express
written permission of TEMENOS HEADQUARTERS SA.
© 2018 Temenos Headquarters SA - all rights reserved.

Table of Contents
Disclaimer.................................................................................................................................. 9
Introduction ............................................................................................................................ 10
Intended Audience .................................................................................................................. 11
Application Overview .............................................................................................................. 12
Application Architecture.......................................................................................................... 14
Advanced Analytics Data Flow ........................................................................................................ 14
Data ExStore ................................................................................................................................. 16
Core Analytics ETL Flow ................................................................................................................. 18
Extract .......................................................................................................................................... 19
Transform ..................................................................................................................................... 20
Load ............................................................................................................................................. 20
Analytics in Azure PaaS .................................................................................................................. 21
InsightImport ......................................................................................................................... 23
Overview ....................................................................................................................................... 23
Specific Features / Functions .......................................................................................................... 24
Technical Details ........................................................................................................................... 25
Architecture .................................................................................................................................. 25
Technical Components ................................................................................................................... 25
Additional Information ................................................................................................................... 69
Configuration................................................................................................................................. 74
Configuring Source Schema (Add/Remove/Configure Tables) ........................................................... 74
Running the procedures / Configuring the Analytics ETL Job ............................................................ 75
Logging......................................................................................................................................... 75
InsightLanding ........................................................................................................................ 76
Overview ....................................................................................................................................... 76
Specific Features / Functions .......................................................................................................... 77
Technical Details ........................................................................................................................... 78
Architecture .................................................................................................................................. 78
Technical Components ................................................................................................................... 79
Configuration................................................................................................................................. 97
Configuring ExtractList ................................................................................................................... 97
InsightSource .......................................................................................................................... 98
Overview ....................................................................................................................................... 98
Specific Features / Functions .......................................................................................................... 98
Technical Details ........................................................................................................................... 99
Architecture .................................................................................................................................. 99

Technical Components ................................................................................................. 100
Configuration............................................................................................................................... 102
Source Date Selection .................................................................................................................. 102
Running the SQL Stored Procedures ............................................................................................. 103
InsightETL ............................................................................................................................. 105
Overview ..................................................................................................................................... 105
Rules Engine ............................................................................................................................... 105
GDPR .......................................................................................................................................... 105
Data Quality ................................................................................................................................ 105
Logging, Toolkit and Performance Enhancement ........................................................................... 105
Multi-threading ............................................................................................................................ 106
Multi-language reports configuration ............................................................................................. 106
Specific Features / Functions ........................................................................................................ 106
Technical Details ......................................................................................................................... 107
Architecture ................................................................................................................................ 107
Technical Components ................................................................................................................. 109
Configuration............................................................................................................................... 155
Analytics ETL System Dates .......................................................................................................... 155
Rules Engine ............................................................................................................................... 156
Column Splits .............................................................................................................................. 156
Maintaining InsightETL ................................................................................................................. 158
Lookup Rules............................................................................................................................... 158
Banding Rules ............................................................................................................................. 161
Using Data Manager .................................................................................................................... 162
InsightStaging ...................................................................................................................... 164
Overview ..................................................................................................................................... 164
Specific Features / Functions ........................................................................................................ 164
Technical Details ......................................................................................................................... 166
Architecture ................................................................................................................................ 166
Technical Components ................................................................................................................. 166
v_Source Views ........................................................................................................................... 173
System Views .............................................................................................................................. 173
Stored Procedures ....................................................................................................................... 174
Configuration............................................................................................................................... 186
Creating v_Source Views .............................................................................................................. 186
Update Order .............................................................................................................................. 188
Adding Systems ........................................................................................................................... 189

Adding Tables from a new Source System ..................................................................... 189
Preparation ................................................................................................................................. 190
InsightLanding ............................................................................................................................ 190
InsightSource .............................................................................................................. 191
InsightETL (Rules Engine) ............................................................................................................ 192
InsightStaging ............................................................................................................................. 192
InsightWarehouse........................................................................................................................ 192
InsightWarehouse ................................................................................................................. 193
Overview ..................................................................................................................................... 193
Specific Features / Functions ........................................................................................................ 193
Technical Details ......................................................................................................................... 195
Architecture ................................................................................................................................ 195
Data Model.................................................................................................................................. 196
Abstraction Views ........................................................................................................................ 198
Technical Components ................................................................................................................. 199
Configuration............................................................................................................................... 205
Configuring Data Dictionary Table ................................................................................................. 205
SQL Stored Procedures ................................................................................................................ 209
Analytics ETL Procedures ............................................................................................................. 209
Maintenance Procedures .............................................................................................................. 209
Maintenance Agent Job ................................................................................................................ 210
InsightSystem ....................................................................................................................... 211
Overview ..................................................................................................................................... 211
Specific Features / Functions ........................................................................................................ 211
Budget Data .......................................................................................................................... 212
Overview ..................................................................................................................................... 212
Budget Data Granularity and Structure .......................................................................................... 212
Importing Budget Data into InsightLanding ................................................................................... 213
Budget Source Views ................................................................................................................... 214
Budget Data in InsightWarehouse ................................................................................................. 216
Analytics ETL Processing ....................................................................................................... 217
Overview ..................................................................................................................................... 217
Technical Details ......................................................................................................................... 217
Technical Components ................................................................................................................. 217
Process Insight Cubes – By Partition .................................................................... 227
Overview ..................................................................................................................................... 227
Process Data ExStore ............................................................................................................ 228

Overview ..................................................................................................................... 228
DW Online Processing ........................................................................................................... 231
Overview ..................................................................................................................................... 231
Run Reports Subscriptions .................................................................................................... 232
Overview ..................................................................................................................................... 232
InsightLanding CSI Purge ..................................................................................................... 233
Overview ..................................................................................................................................... 233
InsightLanding CSI Defragmentation ................................................................................... 234
Overview ..................................................................................................................................... 234
InsightWarehouse CSI Defragmentation .............................................................................. 235
Overview ..................................................................................................................................... 235
InsightETL Maintenance ....................................................................................................... 236
Overview ..................................................................................................................................... 236
KPI Cache Maintenance ........................................................................................................ 237
Overview ..................................................................................................................................... 237
Multi-Company (Temenos Core banking Specific) ................................................................ 238
Core Banking Multi-Company Data ................................................................................................ 238
Multi-Company Joins in Analytics .................................................................................................. 238
Core banking File Types ............................................................................................................... 238
JOIN features .............................................................................................................................. 242
Core banking Company Metadata ................................................................................................. 243
Primary and Foreign Natural Keys ................................................................................................. 245
Multi-Currency (Temenos Core banking Specific) ................................................................ 246
V_Source views ........................................................................................................................... 246
V_SourceCurrencyBS ................................................................................................................... 246
V_SourceBranchBS ...................................................................................................................... 247
InsightETL................................................................................................................................... 247
Data Model.................................................................................................................................. 248
Reporting Views .......................................................................................................................... 249
Multi-Tenant Deployment ..................................................................................................... 250
Tenant assignment to databases .................................................................................................. 250
Data values ................................................................................................................................. 250
Scheduled Jobs............................................................................................................................ 250
User Roles ................................................................................................................................... 250
Publishing ................................................................................................................................... 250
Cubes ......................................................................................................................................... 251
Reports ....................................................................................................................................... 251

Application .................................................................................................................. 251
General Ledger ...................................................................................................................... 252
Data Model.................................................................................................................................. 253
R19 Model Updates ...................................................................................................................... 253
GL Adjustments ........................................................................................................................... 253
DW Export .................................................................................................................................. 255
InsightImport .............................................................................................................................. 256
InsightLanding and InsightSource ................................................................................................. 256
InsightStaging ............................................................................................................................. 256
InsightWarehouse........................................................................................................................ 257
Reports ....................................................................................................................................... 260
GL Consolidated .......................................................................................................................... 261
InsightStaging ............................................................................................................................. 261
InsightWarehouse........................................................................................................................ 262
Cubes ......................................................................................................................................... 263
Retail Analytics Contents and Relationships ................................................................................... 263
General Data Protection Regulation (GDPR) ........................................................................ 266
High Level Solution ...................................................................................................................... 267
Overview ..................................................................................................................................... 267
Right to Erasure Processing .......................................................................................................... 267
Consent Management .................................................................................................................. 268
Considerations/Dependencies ....................................................................................................... 269
Right to Erasure Detailed Design .................................................................................................. 270
Data Model.................................................................................................................................. 270
Metadata..................................................................................................................................... 270
Analytics Right To Erasure Logic ................................................................................................... 277
Executing Rules to Erase Sensitive Customer Columns ................................................................... 278
Agent Job.................................................................................................................................... 279
Interface and Usage .................................................................................................................... 281
Configuration .............................................................................................................................. 282
Consent Management .................................................................................................................. 287
Datamodel .................................................................................................................................. 287
Table Details ............................................................................................................................... 287
Metadata..................................................................................................................................... 288
Rule Definitions ........................................................................................................................... 288
Configuration .............................................................................................................................. 288
Platform Configuration and Customization .......................................................................... 294

Importing a New Table with SQL Script ......................................................................... 294
Creating a CSV file and copying it to the database server ................................................ 294
Duplicate-checking ...................................................................................................................... 294
Updating SourceSchema in the InsightImport Database ................................................................. 295
Updating ExtractList in the InsightLanding Database ...................................................................... 295
Executing Process Data ExStore or Analytics ETL ........................................................................... 296
Post-Update Checks ..................................................................................................................... 296
Adding Attribute Calculation with SQL Script – Split Operation ........................................................ 300
Pre-Update checks ....................................................................................................................... 300
Updating the AttributeCalculations table in the InsightETL Database ............................................... 301
Executing the Column Split ........................................................................................................... 302
Post-Update Checks ..................................................................................................................... 303
Adding Attribute Calculation with SQL Script – Calculation .............................................................. 303
Pre-Update checks ....................................................................................................................... 303
Updating the AttributeCalculations table in the InsightETL Database ............................................... 304
Executing the Column Calculation ................................................................................................. 305
Post-Update Checks ..................................................................................................................... 306
Adding a Budget Source System with SQL Script............................................................................ 306
Checking Budget data format ....................................................................................................... 306
Configuring Budget table load in InsightLanding ............................................................................ 307
Customizing and testing Analytics ETL Agent Job ........................................................................... 309
Test changes in Analytics ETL ....................................................................................................... 311
Configuring Budget data processing in InsightStaging .................................................................... 312
v_Source Views ........................................................................................................................... 313
Executing Analytics ETL ............................................................................................................... 314
Post-Update Checks ..................................................................................................................... 315
GL Reports in the Analytics Web Front End.................................................................................... 316
Adding a New Column from a Source System or from InsightETL to the Data Warehouse with SQL Script
(Fact Column) ............................................................................................................................. 317
Pre-Update checks ....................................................................................................................... 318
Configuring InsightStaging ........................................................................................................... 319
Creating new UpdateOrder entries ................................................................................................ 322
Disabling the UpdateOrder entries for old v_source view ................................................................ 322
Configuring InsightWarehouse ...................................................................................................... 323
Executing Analytics ETL ............................................................................................................... 325
Post-Update Checks ..................................................................................................................... 326
Adding a New Column from a business rule in Data Manager to the Data Warehouse with SQL Script
(Dim Column) .............................................................................................................................. 327

Pre-Update checks ....................................................................................................... 327
Create Data Manager Business Rule .............................................................................................. 328
Configuring InsightWarehouse ...................................................................................................... 332
Executing Analytics ETL ............................................................................................................... 334
Post-Update Checks ..................................................................................................................... 334


Disclaimer
THIS IS TEMENOS PROPRIETARY AND CONFIDENTIAL INFORMATION AND SHALL NOT BE DISCLOSED
TO ANY THIRD PARTY WITHOUT TEMENOS’ PRIOR WRITTEN CONSENT.

TEMENOS IS PROVIDING THIS DOCUMENT "AS-IS" AND NO SPECIFIC RESULTS FROM ITS USE ARE
ASSURED OR GUARANTEED. THERE ARE NO WARRANTIES OF ANY KIND, WHETHER EXPRESS OR
IMPLIED, WITH RESPECT TO THIS DOCUMENT, INCLUDING, WITHOUT LIMITATION, ANY IMPLIED
WARRANTIES OR CONDITIONS OF MERCHANTABILITY OR FITNESS FOR A PARTICULAR PURPOSE AND
NON-INFRINGEMENT, AND TEMENOS EXPRESSLY DISCLAIMS ANY SUCH WARRANTIES AND
CONDITIONS.


Introduction
Temenos Analytics provides banking-specific analytical solutions that improve business decisions, optimize performance and enrich customer interaction. Financial institutions can transform their organizations to be analytically driven with pre-built models, KPIs, dashboards, applications and reports, coupled with real-time data and predictive analytics, allowing them to compete in the digital world. Analytics can be embedded directly into core banking, channels, CRM and other solutions, empowering people to make smarter decisions and work more efficiently.


Intended Audience
This document is intended for a technical audience and is meant to provide an understanding of how the
Advanced Analytics Platform works, how it is structured, and how to use and configure it.

Users consulting this technical guide will require a working knowledge of Microsoft SQL Server Management Studio, SQL and T-SQL scripting.


Application Overview
The Advanced Analytics Platform is a multi-database business intelligence platform for analytical reporting.
The data warehouse data model is designed specifically for financial institution analytical reporting and ad
hoc data analysis. It supports and integrates multiple data sources and is core banking system agnostic,
meaning it can be configured to work with different core banking systems.

The multiple databases in Advanced Analytics each serve a particular function in the overall data warehousing process. Those functions are described below, together with an overview of the Analytics ETL (Extract, Transform, Load) process.

InsightImport
InsightImport is used to import data from the Temenos Core Banking system. This database handles the import of extracted data from the DW.Export Core Banking application, transforms that data into a proper relational format, and parses and data-profiles it.

InsightLanding
InsightLanding is a multi-source archive used to land all source data together. It holds multiple days of source data and is the only database other than InsightWarehouse that does so. Raw source system data can be consumed at this point by reports or other systems. Any data that may need to be reprocessed into InsightWarehouse needs to be retained in Landing.

InsightSource
InsightSource is used to combine source data from multiple source systems that may have different import frequencies. It takes source system data with differing frequencies and makes one combined copy of the most up-to-date data from each source system.

InsightStaging
InsightStaging is where most of the data mapping and ETL process takes place. Source views are used to map data from source tables into Insight data model objects such as Account and Customer. All of the processes that ultimately transform and load the data into the InsightWarehouse dimensional model are kept here.

InsightETL
InsightETL is used to enhance source system data with business logic such as additional groupings, segmentations and classifications that do not exist in the source data itself, through a functionality called Rules Management. It has facilities to allow for easy maintenance and creation of this logic. Furthermore, through a dedicated stored procedure in this database, multi-threading can be applied to Analytics ETL processing. Unlike all previously described databases, InsightETL does not contain financial data imported from source systems. Instead, this database contains centralized logging tables for the whole Analytics ETL process, as well as tables to monitor Analytics ETL batches, to define and monitor indexes, to configure Data Quality services and to boost performance in various views during Analytics ETL processing. InsightETL also stores the technical components for the configuration of Data Quality services and GDPR-compliant erasure and consent management.

InsightWarehouse
InsightWarehouse is where all data to be used for analysis is ultimately stored, in a proper star schema dimensional model. This is the data source for most out-of-the-box reports for Insight and the data source for the Insight OLAP Cubes.

Budget Data
The Advanced Analytics Platform has a significant amount of functionality and content for budget reporting, but it is important to highlight that Insight is not a budgeting or forecasting tool. It has the flexibility to handle the import and reporting of multiple budgets.

Analytics ETL Processing and Platform Maintenance
Analytics ETL is processed by means of SQL stored procedures called by SQL Agent jobs. The SQL Agent job can be started from any failed step if needed. Alternatively, ETL can be run by calling the SQL stored procedures directly with the same parameters as in the SQL Agent job. Because of this, the ETL process can be customized for different scenarios as needed, such as calling additional custom source systems as part of the job. While the steps in Analytics ETL execute stored procedures which manipulate data in all the previously discussed databases, the stored procedure orchestrating multi-threading and the logging tables are located in the InsightETL database. A number of SQL Agent jobs are also in place to carry out the most commonly used maintenance tasks within the platform, e.g. purging, index defragmentation etc. (a minimal example of starting the job manually is sketched below).

InsightSystem
InsightSystem stores configuration tables and views for the Analytics web front end application. Configuration data in this database is accessed directly by the web application.
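Since ETL and maintenance are driven by SQL Server Agent jobs, they can also be started manually through the standard msdb procedures. The following is a minimal sketch: the job and step names are assumptions and should be replaced with the names configured at your site.

    -- Start the Analytics ETL job manually (job name is an assumption).
    EXEC msdb.dbo.sp_start_job @job_name = N'Analytics ETL';

    -- Or resume from a specific failed step (step name is illustrative).
    EXEC msdb.dbo.sp_start_job @job_name = N'Analytics ETL',
                               @step_name = N'InsightStaging Update';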


Application Architecture
Advanced Analytics Data Flow
In Figure 1, we can see the Advanced Analytics Data flow diagram. The starting point for the data flow
within this platform is our set of data sources. As previously explained, Analytics is a multi-source framework
compatible with a variety of source systems. These source systems feed data to the Advanced Analytics
Platform through the Analytics ETL process.

In the example below, Temenos T24 Core Banking represents the main banking source system – data is
normally extracted from Core Banking during the close of business (even though the extraction job can
also run as a standalone service) and saved into CSV files through an application called DW.EXPORT, which
is part of the Data Warehouse module. From R18, DW.EXPORT can also run in online mode, ensuring that
the Analytics Platform is updated in a near real-time manner.

Core banking data is then imported to the Advanced Analytics Platform using Data ExStore. Data ExStore
is a cost-effective and simple-to-implement solution for banks to extract Core banking data and transform
it into a relational database format. Data ExStore consists of a dedicated SQL database, called
InsightImport, and a set of stored procedures which will load Core banking data, look after data typing,
data parsing, data profiling and finally import new data to InsightLanding, the entry point for any source
system in the platform. If near real-time updates are enabled, InsightImport will contain two separate sets of tables, i.e. a relational replica of the latest set of CSV files extracted by DW.EXPORT from Temenos Core Banking in batch mode, and a set of temporary online tables storing intra-day updates.

InsightLanding is the platform’s Operational Data Store and archives source data, in source format, potentially for every ETL date ever run. Thanks to a series of views, it is possible to query InsightLanding for data archived on different ETL dates. InsightLanding is an important source of information that can directly provide data sets for reports. From R19, InsightLanding also stores a copy of the online tables in InsightImport, which can be used for intra-day reporting.

The integration of incoming data from multiple source systems is done in the InsightSource database, which contains only one day of data collected from InsightLanding. If any of the optional products are available, InsightSource also gathers data from Customer Profitability and Predictive Analytics. These optional modules can be used – during this stage of Analytics ETL – to perform additional calculations which are then written back to InsightSource. For example, Customer Profitability calculates a parameter called Net Income, which indicates the profitability of a specific customer over a certain time horizon (i.e. on a monthly or yearly basis), while Predictive Analytics can calculate more complex predictive parameters such as attrition risk or customer lifetime value. In addition to this, InsightSource can combine data from multiple or different dates. For instance, a bank may have a banking source system from which data is extracted on a daily basis and a Budget source system whose data is only made available once a year. In InsightSource these two source systems will be integrated even though their data bears different extraction dates. This database still stores data in the source system format; the actual transformation of this data from source to target tables, which is the very core of the ETL process, is performed within the InsightStaging database.

InsightStaging extracts data from InsightSource and transforms it according to a series of source views, which define how to map source data into the target tables stored in the InsightWarehouse database, the platform’s data vault. During ETL, InsightStaging also applies to the data the business rules defined within the InsightETL database – these business rules allow, for instance, source data to be grouped into new categories, calculations to be carried out to populate new columns, and entirely new tables to be created in the InsightStaging, Source or Landing databases. Business rules can also alter the structure of abstraction views in the InsightWarehouse database. Business rules can be designed and amended in InsightETL directly using the Analytics Web Application.


Once this process is completed, the transformed data is loaded into InsightWarehouse and, finally, from
InsightWarehouse to the Analytics Cubes or, in some cases, the SSAS Tabular Models. InsightWarehouse,
InsightLanding, and the Cubes will be the core sources of information for the quick reports, custom reports,
analyses reports, Power BI reports, KPI dashboards and Information Tiles accessible from the Presentation
Layer.

Some of the databases shown in the Data Flow Diagram below are important for the configuration and execution of ETL but are not used as data sources for reports and other Analytics Front End content. To start with, the aforementioned InsightETL database stores performance-boosting features (e.g. multi-threading), data quality services configuration and logging for the Analytics ETL process, in addition to the already discussed Rules Design facilities. Also, the InsightSystem database hosts configuration and parameter tables used by the Analytics Web Application – in the same way, the InsightPricing and PredictivePlatform databases contain configuration and parameter tables used by the Customer Profitability and Predictive Analytics modules, respectively.

Figure 1 - Advanced Analytics Data Flow Diagram


Data ExStore
Data ExStore is a cost-effective, simple-to-implement packaged solution for banks to extract, parse, transform and store their Temenos Core banking data in a relational database format.

Figure 2 - Data ExStore overview

Data ExStore is based on Microsoft SQL Server and works with Core Banking/T24 R08 and above, on any platform – Oracle, Microsoft SQL Server, DB2 and jBase. It provides any Temenos Core banking client with an easy way to extract data from Core Banking for reporting using Temenos Analytics, SQL Server Reporting Services or any other reporting tool compatible with SQL Server.

As shown in the figure above, Data ExStore consists of three components –

• DW.EXPORT
DW Export is a Temenos Core banking application which exports data from Core banking into
CSV/text files that can be consumed by downstream Analytics processes. The data extracts are not data typed or parsed; everything is in text format (dates, balances, etc.). DW.Export is executed as part of the overnight close-of-business process to perform bulk data extraction. This batch process can also be executed in an online mode for intra-day updates.

Out-of-the-box, DW.EXPORT is configured to extract a set of tables from Temenos core banking.
If any additional table is required by the client, the DW.Export application can be configured to include those tables.

• InsightImport (Temenos Analytics Parser)


Analytics contains SQL Server processes that consume the text files and use Temenos Core banking STANDARD.SELECTION table data to parse and data type the text data and load it into a SQL Server schema (STANDARD.SELECTION is the application storing Core banking’s dictionary). This process runs automatically and responds to any Core banking data which is exported using DW.Export. Data dictionary information imported from Core Banking is tested against actual data and, if proven inaccurate, InsightImport will run its own data profiling process to overwrite Core banking’s metadata.

The extracted DW.Export text files still contain complex Core Banking data with multi-value and
local reference fields. This process automatically parses the data into simple, readable format
which can be used by any downstream bank process or reporting engine.

Out-of-the-box, InsightImport (Temenos Analytics Parser) is configured to process a set of tables from Temenos Core banking. If any additional table is required by the client, the InsightImport
database can be configured to parse those additional tables. InsightImport will be discussed in
detail in the next chapter.

The cleansed batch data from Temenos Core Banking will be loaded into a set of SQL Server tables with the “dbo” schema, e.g. dbo.CUSTOMER. If online processing is enabled, in addition to these standard “dbo” tables, InsightImport can also store a set of tables with the “Online” schema that store intra-day updates in quasi-real time, e.g. Online.CUSTOMER. Database administrators can define whether individual tables should be subject to online processing or not through the SourceSchema configuration table (which will be illustrated in the InsightImport chapter).
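As a purely hypothetical illustration, flagging a table for online processing could look like the following. The SourceSchema table name comes from the text above, but the column names used here are assumptions; the actual layout is documented in the InsightImport chapter.

    -- Hypothetical sketch: enable online processing for CUSTOMER in the
    -- SourceSchema configuration table (column names are assumptions).
    UPDATE InsightImport.dbo.SourceSchema
    SET [Online] = 1
    WHERE TableName = 'CUSTOMER';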

If online processing is enabled for a table, the first data load will always be in batch mode: data is extracted from CSV files, data typed, data profiled, subjected to data quality checks and to multi-value, sub-value and local reference parsing, then loaded into the “dbo” tables as in standard batch processing. Once the first load is completed, the online processing will pull data from the source system after a set time interval and populate the “Online” schema tables with any intra-day updates. New entries in the “Online” tables will then be subjected to data quality checks and copied to the standard “dbo” tables. Finally, the whole content of the updated “dbo” table will be subject to multi-value and sub-value parsing and copied to InsightLanding.

• InsightLanding
InsightLanding is effectively a relational copy of Temenos Core banking in a SQL Server format
and can be used for reporting purposes. By default, tables in this database are updated in batch
mode when Analytics ETL runs. InsightLanding will automatically contain all data from Temenos
Core banking which is extracted and parsed using InsightImport. This includes all extracted module data (AA, LD, etc.). These copies of data are kept in the same physical database, separated by source table name with a source system schema. E.g. if the BS (Banking System) source system has a CUSTOMER table, there will be one BS.CUSTOMER table in InsightLanding that stores a copy of all CUSTOMER records loaded into InsightLanding for each business date processed during Analytics ETL. By default, all days of data are stored in one single table in Columnstore format, and users can filter records loaded on a specific date using the MIS_DATE column as a filter.1

1 If a bank wishes to use row-based format tables in InsightLanding instead, there will be a CUSTOMER table for each day of data that has been extracted, and the business date will be part of the table schema, e.g. 20190101BS.CUSTOMER.
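For example, a report writer could pull a single business date of customer records from InsightLanding as follows; the table and column names come from the text above, while the date literal is illustrative and its format may differ per site.

    -- Read one business date of CUSTOMER data from the Columnstore table.
    SELECT *
    FROM InsightLanding.BS.CUSTOMER
    WHERE MIS_DATE = '2019-01-01';  -- illustrative date value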

As in InsightImport, the database administrator can also enable online extraction for specific tables in InsightLanding. This is done through an online flag that can be enabled or disabled in the ExtractList configuration table (as explained in the InsightLanding chapter, this table defines the extraction features for each and every table stored in InsightLanding). If online processing is enabled for a table, the first processing will be in batch mode and populate the standard table with a source system-specific schema, e.g. BS.CUSTOMER. Any intra-day update thereafter will be stored in a dedicated table with the “Online” schema, e.g. Online.CUSTOMER. It must be noted that, on the one hand, when online processing is enabled, BS.CUSTOMER in InsightLanding will remain the same until the next ETL, while dbo.CUSTOMER in InsightImport is constantly updated. On the other hand, if online processing is enabled for the CUSTOMER table in InsightLanding, the latest copy of the dbo.CUSTOMER table in InsightImport (which will include intra-day updates) will be stored in the Online.CUSTOMER table in InsightLanding after a set time interval. The online processing for InsightImport and InsightLanding is shown in Figure 3.

InsightLanding allows for data archiving and for reporting to be produced on different days of
data and on intra-day data. This archived data can also satisfy different bank audit requirements


regarding the archiving of data. All data types will be accurate, and local reference and multi-value fields will be parsed. This point-in-time data extract can be used for reporting, building extracts, or building interfaces to a bank’s data warehouse. Reports are built upon two types of InsightLanding views, i.e. batch views with source system-related schemas (e.g. BS.v_Customer) and online views with online schemas (e.g. Online.v_Customer). While the former are based only on batch InsightLanding tables such as BS.CUSTOMER, the latter query a full set of all ExtractList tables (including batch-only ones) and are a union of the online deltas and the previous business date’s batch. More information will be provided in the dedicated InsightLanding chapter.
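A minimal sketch of the difference between the two view types, using the naming patterns given above (COUNT(*) is used so that no column names need to be assumed):

    -- Previous business date batch only.
    SELECT COUNT(*) FROM InsightLanding.BS.v_Customer;

    -- Union of online deltas and the previous business date batch.
    SELECT COUNT(*) FROM InsightLanding.Online.v_Customer;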

Figure 3 - Online processing in InsightImport and InsightLanding

Data ExStore does not include Business Content, i.e. it does not contain any dashboards, BI applications or OLAP cubes; it is simply a data extraction (and transformation) solution with reporting based on Core Banking tables in a relational format. Business Content is available in the Analytics Web Application and its corresponding InsightSystem database, and within the SSAS Analytics Cubes or Tabular servers.

Also, Data ExStore does not include a facility to extract data from additional Data Sources – Data ExStore
only works with Core banking/T24 data (R08 and above). Back-patches can be made with R06 and above
but must be reviewed by the Insight Product team.

Core Analytics ETL Flow


As explained in the previous section, until the data flow reaches InsightSource all imported data still complies with the source system’s original data model, albeit in a relational format and transformed by a set of Data Manager rules. The core Analytics Extract, Transform, Load process takes place in the InsightStaging database; the figure below shows in detail how this happens.


Figure 4 - Core Analytics ETL Flow

The core Analytics ETL process is controlled by a stored procedure called InsightStaging..s_InsightStaging_Update, which is executed as a step of the Analytics ETL SQL Server Agent job. An Extract, Transform and Load stored procedure exists for each dimension and fact table in InsightWarehouse – these child stored procedures are called internally by the parent procedure InsightStaging..s_InsightStaging_Update. Child stored procedures (steps) can in turn invoke third-level stored procedures (sub-steps) – the order in which these steps and sub-steps are executed, and the enforcement of any dependencies between them, is controlled through the UpdateOrder table in InsightStaging.
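As a sketch, the parent procedure can be invoked directly and the configured ordering inspected. The procedure’s parameters mirror the SQL Agent job step and are site-specific, so none are shown here, and no assumptions are made about the UpdateOrder column layout.

    -- Run the core ETL parent procedure directly (parameters, if any, should
    -- match those used in the Analytics ETL SQL Agent job step).
    EXEC InsightStaging..s_InsightStaging_Update;

    -- Inspect the configured step/sub-step ordering and dependencies.
    SELECT * FROM InsightStaging.dbo.UpdateOrder;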

High-level progress and timing of Analytics ETL are logged in the StagingEventLog table while more
detailed information is available in the StagingEventLogDetails table, both of which are located in the
InsightETL database.
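A quick way to review ETL progress, assuming the logging tables live under the dbo schema of InsightETL:

    -- High-level progress and timing.
    SELECT TOP (50) * FROM InsightETL.dbo.StagingEventLog;

    -- Detailed, step-by-step information.
    SELECT TOP (50) * FROM InsightETL.dbo.StagingEventLogDetails;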

Extract
The core extraction phase takes place when data is imported from the InsightSource database to InsightStaging. Here, columns coming from the source systems are mapped to the data warehouse columns through the so-called v_source views. V_source views can join multiple tables from InsightSource, rename their columns and apply simple transformations, e.g. CONCAT.


Different source systems in InsightSource have separate sets of v_source views. However, separate tables from the same source system can be combined into a single table. The outcome of the mapping performed by v_source views is placed in temporary tables called source tables.

For example, the v_sourceAccountBS_AALending view uses the AA_ARRANGEMENT table from InsightSource as its main source table, joined with ACCOUNT, DATE, COMPANY, AA_PRODUCT etc.; columns from all of these joined tables can be extracted, relabelled and used.
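An illustrative sketch of such a view follows. Only the view name, the joined table names and the use of CONCAT come from the text above; the join keys and output columns are invented for the example and will differ from the real definition.

    -- Illustrative only: a v_source view mapping arrangement-level data to
    -- warehouse Account columns. Join keys and columns are assumptions.
    CREATE VIEW dbo.v_sourceAccountBS_AALending
    AS
    SELECT
        arr.ID                     AS AccountNum,       -- renamed for the warehouse model
        arr.PRODUCT                AS ProductCode,
        acc.CATEGORY               AS AccountCategory,
        CONCAT(arr.ID, '-', co.ID) AS AccountCompanyKey -- simple transformation
    FROM InsightSource.BS.AA_ARRANGEMENT AS arr
    JOIN InsightSource.BS.ACCOUNT AS acc ON acc.ARRANGEMENT_ID = arr.ID
    JOIN InsightSource.BS.COMPANY AS co  ON co.ID = arr.CO_CODE;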

The data extracted through this view is materialized, within the InsightStaging database, in the sourceAccountBS_AALending temporary table, based on the content of v_sourceAccountBS_AALending. The data types of the columns materialized in the “source” tables in InsightStaging, and any additional column that is not present in the v_source view, are read from the DataDictionary table in InsightWarehouse (see the Load section of this chapter).

From a technical point of view, data is materialized in a source temporary table when the corresponding s_*_Extract stored procedure is executed (e.g. s_FactAccount_Extract, s_DimAccount_Extract etc.). As previously mentioned, each Dim or Fact table in the Warehouse has a corresponding s_Fact*_Extract or s_Dim*_Extract stored procedure. This same stored procedure also reads the Data Dictionary to obtain the correct data type for each column in the “source” table and applies the conversion.

In addition to this, InsightStaging..s_InsightStaging_Update also internally calls the
s_CreateRuleGroup stored procedure, which executes all InsightStaging business rules with execution
phase set to “Extract”.

Transform
In the Transform phase, further calculations are applied to the data materialized in the temporary “source”
tables of InsightStaging – these calculations are defined through Data Manager business rules for the
InsightStaging database with execution phase set to “Transform”.

In addition to this, the Transform phase of Analytics ETL also extracts any required values from the
databases of any optional product installed and applies them to InsightStaging tables – for example, this
is when Customer and Account Monthly and Yearly Net Income are imported into InsightStaging if the
Customer Profitability product is installed; likewise, data can be imported from the databases of the
Predictive Analytics module, from Enterprise Risk Management etc.

Then, the transformed data from the “source” tables is copied to a set of “staging” temporary tables, and
the corresponding Dim and Fact tables are created but left empty (e.g. the modified content of
sourceAccount is moved to stagingAccount, and empty DimAccount and FactAccount tables are created
during the Transform phase). These temporary tables are then populated and finally loaded into the
InsightWarehouse database during the Load phase.

Technically, an s_*_Transform stored procedure exists for each Dim or Fact table and executes what is
described above (e.g. s_FactAccount_Transform, s_DimAccount_Transform etc.).
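A minimal sketch of such a sub-step, with illustrative table names, might look like this:

-- Copy the transformed "source" rows into the "staging" table
SELECT *
INTO dbo.stagingAccount
FROM dbo.sourceAccount;

-- Create the empty Dim/Fact shells that the Load phase will populate
SELECT TOP (0) * INTO dbo.DimAccount  FROM InsightWarehouse.dbo.DimAccount;
SELECT TOP (0) * INTO dbo.FactAccount FROM InsightWarehouse.dbo.FactAccount;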

Load
The list of dimension and fact columns to be loaded into the InsightWarehouse database is controlled by the
DataDictionary table (and by the corresponding v_DataDictionary view) residing in that database. This table
also defines whether each dimension should be treated as type 1 or type 2.

The only columns whose content is not defined through mapping or calculations based on InsightStaging
data are surrogate keys (e.g. CustomerId, AccountId etc.), which are added during the Load part of ETL.
Surrogate keys are written directly in InsightWarehouse, while the remaining columns are copied from the
temporary Dim and Fact tables in InsightStaging to the corresponding Dim and Fact tables in
InsightWarehouse. If an error is encountered during load, the current transaction (per table) is rolled back,
the Analytics ETL process is interrupted and an error is reported.

Users will only be able to see the newly loaded data in reports once the process has completed successfully
for each and every table and the new business date is enabled, at the end of the Analytics ETL process.

The whole process described above is executed by a set of sub-steps called s_*_Load – again, a stored
procedure exists for each and every Dim or Fact table (e.g. s_FactAccount_Load, s_DimAccount_Load
etc.).

In addition to this, InsightStaging..s_InsightStaging_Update also internally calls the
s_CreateRuleGroup stored procedure, which executes all InsightStaging business rules with execution
phase set to “Load”, if any.

Analytics in Azure PaaS


Azure Platform as a Service (PaaS) is a complete development and deployment environment in the cloud,
with resources that enable you to deliver everything from simple cloud-based apps to sophisticated, cloud-
enabled enterprise applications (see https://azure.microsoft.com/en-us/overview/what-is-paas/). Banks
using PaaS can purchase the resources they need from a cloud service provider on a pay-as-you-go basis
and access them over a secure Internet connection.
PaaS is designed to support the complete web application lifecycle and it includes infrastructure—servers,
storage, and networking—but also middleware, development tools, business intelligence (BI) services,
database management systems, and more. This cloud environment allows banks to avoid buying and
managing software licenses, the underlying application infrastructure and middleware or the development
tools and other resources.
From R19, all components of the Advanced Analytics Platform will be available in cloud native Azure PaaS
– a sample of PaaS-based architecture is shown in Figure 5.
The Data ExStore component in Azure is designed to provide a SQL Azure DB loaded with fully parsed and
data typed Temenos Core banking data. Core Banking extracts are sent to an Azure compute VM that holds
two traditional SQL Server databases. This VM is only needed for processing and is only brought online
when a new extract is ready for processing.
The extract is processed using the same methodology as the on-premises version of Analytics Data ExStore.
When processing is complete, the updated delta from the ODS is transferred to a SQL Azure DB where
the core ETL processing is carried out and the result is loaded into the Analytics Data Warehouse. Then,
once the copy is complete and verified, the Azure VM is taken offline to leverage the elasticity of the cloud
and keep costs down.
The SQL Azure DB is always online and, by the nature of Azure SQL, is highly available and fully
recoverable by design. In addition, for smaller institutions, or for institutions interested only in recent
data, the SQL Azure solution can be run on a standard subscription, further reducing costs. For larger banks
that require substantial data volumes, columnstore will be used and a premium subscription will be
needed.
The Analytics web front-end application will be hosted in Azure Websites and the application back end will
be hosted in Azure DB, allowing for utilization of Azure scalability and availability. Reporting options will
include interactive pivot reports on relational datasets and on Azure Analysis Services tabular models,
graphical information tiles, and Power BI reports. Graphical information tiles will be embeddable through
our Embedded Analytics functionality.


Figure 5 – Analytics in Azure PaaS

When PaaS is used, the Advanced Analytics platform preserves its full feature set and code rework is
minimized. The advantages of this kind of deployment for the bank include more elastic scalability, the
possibility to leverage built-in high availability and disaster recovery functionality, and the ability to no
longer maintain supporting IT infrastructure.


InsightImport
Overview
Advanced Analytics has a rich data warehouse which accepts data from multiple data sources and is
architected to be core banking system agnostic. However, as explained in the previous chapter, specific
functionality is built into it to deal explicitly with importing and parsing data from the Temenos core
banking system. This functionality exists in the InsightImport database, and this database and the
procedures within it are only relevant for importing data from Temenos Core banking, not from other
core banking systems.

As previously mentioned, Temenos Core banking can export data using an Analytics-specific application
called DW.Export, which exports data to CSV files (batch export) or directly to the Analytics Platform
(online export). When CSV files are used, a log of all the exported CSV files is provided in a CSV file called
HASH_TOTAL. See the DW.Export User Guide for a detailed explanation of this application and its features.
The CSV data coming from DW.Export is still in the Temenos Core banking data structure, meaning it still
has multi-values and sub-values.

The InsightLanding database requires data in a relational database format with properly typed data.
InsightImport bridges that gap by importing the CSV files, data typing each column, and parsing the local
reference, multi-value, and sub-value columns (for more information and examples about how multi-values,
sub-values and local reference fields are parsed in InsightImport and for more detailed information about
the R19 enhancements to this feature, please refer to the Additional Information section of this chapter).
Sub-table parsing relies on fine-grained, row-level multithreading, which improves efficiency.
Furthermore, the sub-table parser has been rewritten to work with Data Quality to multithread rows in
memory. The end result is a properly data-typed relational format for import into InsightLanding.

ETL batch control, multithreading configuration and all logging concerning data processing in this database
are stored in dedicated tables within the InsightETL database.


Specific Features / Functions

Feature Description
Import Data from CSV The import procedure connects to CSV files which are exported by
DW.Export and imports those files into the InsightImport database.

Data Typing Data from the CSV files of DW.Export is all considered text by
default. InsightImport provides a way to automatically type the data
being imported to be dates, integers, etc. in order to have efficient
data storage.
Data Parsing Temenos Core banking is a highly flexible system which allows
entire tables to be stored within a single column of a row from the
parent table. These are Local Reference, Multi-Value and Sub-Value
fields. InsightImport provides a mechanism to parse these cells out
into additional tables which are joined to their parent table on a
unique key.
Data Profiling A data profile is created which checks if the source system dictated
data types are correct and records a newly calculated data type if
the source data type is not adequate. The data profile includes Min
Value, Max Value, Length, the string with the maximum length, Not
Null, Is Numeric, Is Date etc.
User Data Type Tables created in InsightImport are created from a Data Dictionary.
Override The user can override a source system data type with another
datatype if required.

Object Renaming InsightImport allows you to rename the objects being imported at
the table and column level. If a table has been renamed in Temenos
Core banking but its structure hasn’t changed, this feature can be
used to rename the table so the mapping further downstream in
InsightStaging does not have to be modified and can retain logic
using the original table name. This is quite useful for CRF report
extracts which are frequently renamed.
Skip Columns InsightImport allows you to skip importing certain columns as they
may not be required. This allows you to store less data or ignore
columns that you know to contain bad data.

Data Quality Service InsightImport allows you to ensure that the Analytics ETL and
Process Data ExStore jobs do not fail if the import process encounters
non-critical bad data.

GDPR Compliance InsightImport allows you to import from Temenos Core Banking the
metadata required to manage consent and data erasure according
to GDPR legislation.


Technical Details
Architecture
InsightImport is the second step in a Temenos Core banking implementation of the Advanced Analytics
platform, the first step being DW.EXPORT extraction. It accepts the CSV files exported from Core
banking, or online updates, imports them into a SQL database, assigns data types and parses the data into
a relational format to be consumed by InsightLanding.

Figure 6 - InsightImport in the Core Analytics ETL Flow

Technical Components
Because each Temenos Core banking implementation will have different modules and local customizations,
the tables sent to the Analytics platform may also differ between installations. There may be
additional tables and/or additional columns required. A combination of locally configured values and data
from DW.Export imported tables lets the process know which tables to import and which columns are
local references or multi-values.

For example, the SourceSchema table is loaded with the relevant table names to import, and it is
configured manually via the Local TFS layer. SourceSchema is an Analytics-specific configuration table
which contains the list of tables that should be created in InsightImport as replicas of the CSV files imported
from Core banking. This table also includes a set of flags that identify whether ETL should parse local
reference, multi-value or sub-value fields within an imported Core Banking table (i.e. T24ExtractLocalRef,
T24MultiValueAssociation and T24SubValueAssociation, respectively). SourceSchema also includes flags
that specify whether DW Online Process and Online Process should be executed on a table or not.

The first time the Analytics ETL process runs, it will generate a data model consisting of a table
(Insight.Entities) containing all the tables to be created including sub-tables (Local Refs, multi-values
etc.) and a table (Insight.Attributes) which contains all the columns to be created.

The Insight.Entities and Insight.Attributes tables are populated from the following metadata sources:
the SourceSchema table, the Standard_Selection table, the Insight.SourceColumns table (which is populated
from the top line of each CSV file), the Local_Table table and the Local_Ref_Table table.

s_ImportBaseTables and s_ImportSubTables are the stored procedures that control the processes
that create tables from CSV files and from LocalRef/Multi-value/Sub-value parsing. They are supported by
a number of helper stored procedures that will be discussed as we proceed.

The general idea behind the process is that, on first load, the data is loaded into tables created with
nvarchar(max) columns; the data is then profiled and the correct data type determined – this process
is called Data Profiler Import (DPI). Once the data profiling process is complete, data tables are created
with the correct data types and then loaded with data. Furthermore, an entry for each column of each data
table imported from Core banking, with its associated data type, is stored in the Insight.ImportSchema
table.

On subsequent runs, the data profiling import process is not repeated for existing tables; these
tables are loaded into InsightImport using the data profile information recorded in
Insight.ImportSchema during the first Analytics ETL run, importing only the content of those columns that
were previously profiled and ignoring any new columns.

If nothing changes in the metadata, this speeds up the process and minimizes the risk of Analytics ETL
failure, as the data is loaded directly into the existing typed table and data profiling is not done again. If
something does change during subsequent ETL processes, like a column size in Standard_Selection, we
will need to manually edit the appropriate configuration table in InsightImport to ensure that the process
re-runs the data profiling step and the required tables are recreated. To be specific, if a new table is created,
we will need to enter a definition for it in SourceSchema. If new columns are added to an existing table,
instead, all column entries related to the existing table will have to be deleted from Insight.ImportSchema –
this will trigger data profiling to be re-done for all the columns in the table (including the new ones) during
the next Analytics ETL execution.
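For instance, to force re-profiling of a table that has received new locally developed columns, its existing entries can be removed as sketched below (the table name is illustrative):

-- Re-trigger data profiling for every column of BNK_ACCOUNT on the next ETL run
DELETE FROM InsightImport.Insight.ImportSchema
WHERE TableName = 'BNK_ACCOUNT';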

Data Quality
From R18, a new Data Quality Import (also known as DQI) has been included in the Data ExStore stage
of Analytics ETL. The main design goal for DQI is to achieve a relatively stable table structure for import.
With such stability, we now have the chance to detect corrupted data and safely bring daily data with the
expected schema into InsightLanding using Columnstore Index technology.


This import stage could, in fact, be interrupted by corrupted fields in a Temenos Core Banking
CSV file, or by unexpected new text that is no longer compatible with the strongly-typed
target column. Often, it is a trivial column that terminates the whole ETL, even though an immediately
accurate value for this column is not urgently required and a correction could be applied later, provided of
course that the bad data can be spotted and cleansed during the importing phase and then red-flagged to all
downstream consumers. This is the main reason we need a Data Quality component.

Secondly, the column schemas of the columnstore indexes in the new ODS should be kept relatively stable
and not subject to frequent adjustments – this feature is discussed in the chapter dedicated to
InsightLanding. Maintaining a set of fixed column data types for as long as possible in the ODS tables is key
to achieving both very high column-wise data compression and optimal performance. It is therefore
necessary to have a regulator on data quality that constrains the data types cast for the CSV fields, so the
source CSVs always conform to the existing target SQL tables and the penalty of randomly
rebuilding huge archival tables is avoided.

In general, in the event of encountering corrupted values, instead of terminating the ETL process, the show
now goes on and the bad data is dealt with later. Before that, the Data Quality component marks
the affected areas, repairs them according to predefined rules, and provides all subsequent handling
with the necessary logging. Configuration tables for parameterizing DQI repairs and logging are stored in
the InsightETL database and are discussed in a dedicated chapter of this document.

In the following image we can see a diagram of the new data import process flow with Data Quality
Services.

Figure 7 - New data import process flow with Data Quality Services


Data Quality and Data Quality System columns in InsightImport


As previously mentioned, the smallest operation unit, or grain, of Data Quality Import is the column
level. Once configured, all table columns in InsightImport are subject to data quality checks and, if
necessary, in-place corrections are made to columns failing these checks, according to the data quality rules.

Furthermore, when Data Quality Import is executed, three new columns are added to each and every
InsightImport base or ‘sub’ table affected (e.g. BNK_CUSTOMER, BNK_ACCOUNT, BNK_ACCOUNT_LocalRef
etc.) to host detailed data quality results about the rows processed – these new DQ-related columns are
ETL_DQ_RevisionCount, ETL_DQ_ColumnsRevised and ETL_DQ_ErrorMessage.

These ETL_DQ system columns are permanent and, if any data quality issue is encountered in a table, they
will be added to it starting from the second execution of the Analytics ETL, when Data Quality comes into
effect (as previously illustrated, the very first ETL execution uses Data Profiler only). Therefore, users do
not need to create these system columns manually; moreover, they should avoid modifying them – even
the position of the data quality system columns in the table should not be changed (the DQI process makes
sure they are always appended at the rightmost position of each table to achieve optimized ordinal mapping;
this guarantees the best processing performance, so changes to this initial setting are strongly discouraged).
The new columns have the following features:

ETL_DQ_RevisionCount (Nullable Int)


This column stores the total number of changes made by DQ to fields on the current row. This column is
initially missing from InsightImport data tables but it will be internally added to each and every one of them
by Analytics ETL, prior to the DQI checks.

A NULL value in this attribute means the row did not get DQI-checked, since the T-SQL bulk insert statement
successfully imported it. The value ‘0’ means that DQI checked the row and no issue was found. Any positive
integer indicates the number of DQI issues encountered – the details about these issues are further
illustrated in the other two data quality system fields.

ETL_DQ_ColumnsRevised (Nullable NVarChar(4000))


Columns that have been corrected during DQI have their names listed in this system field, delimited by
commas. If ETL_DQ_ColumnsRevised remains null, it means that no issues were found. Like
ETL_DQ_RevisionCount, this column is initially missing from InsightImport data tables but is
internally added to each and every one of them by Analytics ETL, prior to the DQI checks.

ETL_DQ_ErrorMessage (Nullable NVarChar(4000))


If any issue is encountered on a row during DQI, the detailed error message is captured from the
in-memory SQL Server testing structure, and the error messages for all the raw values found to be
problematic are concatenated together in the ETL_DQ_ErrorMessage column. This information is written
in JSON-array format.

Users can either browse the resulting JSON string in a plain view or use the fn_GetJsonErrMsg function in
InsightETL to output it in table style. In the rare scenario in which the total length of the original JSON string
exceeds 4000 characters, DQI truncates it at 4000 characters and replaces the trailing characters with three
dots. An example of the plain view is provided in the image below.

Figure 8 - Example of JSON string in plain view
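As a hedged usage sketch (the function's exact signature should be verified against the InsightETL database of your installation), rows revised by DQI could be inspected as follows:

-- List DQI-revised rows of an imported table and expand their error messages
SELECT t.[@ID], t.ETL_DQ_ColumnsRevised, err.*
FROM dbo.BNK_ACCOUNT AS t
CROSS APPLY InsightETL.dbo.fn_GetJsonErrMsg(t.ETL_DQ_ErrorMessage) AS err  -- assumed single-parameter signature
WHERE t.ETL_DQ_RevisionCount > 0;  -- rows actually corrected by DQI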

Data Quality Configuration tables, stored procedures and functions in InsightImport


A few changes to the technical components of InsightImport have been applied to accommodate the new
DQI process. To start, the new Insight.ImportSchema table has been created to store information about
column schema and the final data type for each column value considered in DQI. More details about this
table will be provided in a dedicated section of this chapter.

To further adjust DQI’s behaviour, the ‘DQAlwaysOn’ flag has been added to SourceSchema – as we will see
in the dedicated section, this flag allows Analytics ETL to skip the Data Profiler Import and apply DQI
immediately, from the first execution.

In addition to this, the stored procedures and functions in charge of processing Temenos Core Banking data
in InsightImport have been rewritten, or added ex novo, to accommodate the new DQI process. New DQI-
related procedures and functions have also been created in the InsightETL database. Technically, the
core programming of DQI is implemented as a standard C# DLL library, wrapped within a SQL-CLR
assembly and called by T-SQL stored procedures. The core DLL executes the DQ logic in a parallel, high-
speed streaming fashion, utilizing multithreaded firehose reads and bulk writes from start to finish. In
general, memory consumption is low and stable while CPU and disk utilization is high and efficient.

DQI Rules
All the Data Quality rules used to define whether data table columns are correctly data typed and, hence, to
orchestrate the DQI process, are stored in the RuleDataQuality table of InsightETL. More information about
this table is provided in the dedicated chapter and section.


DLL Assembly
In the initial R18 release, the two DLLs required for DQI are built on .NET Framework 4.5.1, targeting SQL
Server 2016. Additional code-signing steps with a digital certificate are required for installation on SQL
Server 2017. These DLLs are listed below:

 Temenos.Analytics.DataQuality: standard C# class library.

 InsightETL: SQL-CLR assembly. It depends on the one above.


Both the DLLs above require UNSAFE permission level since CSV file access and multithreading are in use.

Old process
The old import process, up to Release 2014, relied on the SourceSchema configuration table and on
the following SQL stored procedures:

 s_InsightImportSystemTables_Update

 s_InsightImport_Update
 s_T24TableFromSelection_Create

 s_T24SourceSchema_Update

These stored procedures are completely deprecated from R17, having remained available for backward
compatibility until Release 2016.

Tables
SourceSchema
The table InsightImport..SourceSchema contains a list of all files extracted from Temenos Core banking
needed for the Analytics ETL process and should be configured before the first Analytics ETL is executed.
Records in the SourceSchema table are used to control the setup of data files that need additional
processing (i.e. parsing multi-valued or sub-valued columns and data type conversion).

There should be a record for each table being imported from Core banking. Please note that there may be
more than one CSV file per Core banking table being imported but you still only need one record per source
system table, not per CSV file.

From R19, SourceSchema includes a flag that allows the database administrator to configure a table for
online processing. It should be noted that any table that has an API needs to be configured for batch
extraction only (with the exception of T24 metadata tables such as 'LOCAL_REF_TABLE', 'LOCAL_TABLE' and
'STANDARD_SELECTION').

The list of table names in the SourceSchema table should match the list of table names in the
InsightLanding..ExtractList table. Below is a description of each column in the SourceSchema table
and how it should be populated for each record.

Column Name Description


SourceSchemaId Record Id (identity column). Populated automatically.
TableName Table name
SchemaName Should always be BS (Banking System)

Enabled_ This column is used in connection with the Configuration
column. Acceptable values for Enabled_ are 1 (which means
Yes), 0 (which means No) or NULL (which also means No).
If a SourceSchema row has the Enabled_ flag set to 1, the
Core banking table definition in that row is taken into
consideration during the Analytics ETL process; otherwise,
it is ignored. Enabled_ is used both to exclude redundant
Core banking tables from being loaded into InsightImport
and to disable obsolete table definitions which should not
be erased or overwritten. E.g. consider the case in which a
client wants to modify the out-of-the-box SourceSchema
definition for a certain table, e.g. ACCOUNT. The client must
mark as disabled (i.e. set the Enabled_ column to 0) the
ACCOUNT record with Configuration set to ‘Modelbank’.
Then, they can create a new SourceSchema row, also
defining how to extract the ACCOUNT table, but with the
Enabled_ flag set to 1 and the Configuration column set to
‘Local’ (a worked example is sketched after this table).
RenameColumns Used to rename primary IDs in Core banking files. This is
done to stay consistent with the mapping already in place
in the InsightStaging database for example.
Syntax: CurrentName:NewName (i.e. LIMIT_ID:@ID)
PrimaryIndexColumns Used to create a primary index on columns defined.
Syntax: column1,column2,column3…
(i.e. @id)
RebuildTableFromCode Rebuild Table From Code if there is no Standard Selection.
Not currently in use, for future development only.
T24RebuildFromSelectionIfMissing If there is no data in a T24/Core banking table, then no
CSV file will be created for this table. Consequently, the
table will not be imported from Core banking and this will
cause an issue with Analytics ETL which expects this table
to be present. Setting this option to yes will rebuild the
table from the definition in Standard Selection in the event
that it is missing. (i.e. should be always set to Yes)
T24SelectionName Name of the actual Core banking application in the original
format (e.g. ACCOUNT.CREDIT.INT)
T24LRSVSelectionName Not currently in use, for future development only.
T24SkipDataTypeConversionForColumns Used to skip data type conversion on tables
Syntax: column1, column2,…
(e.g. CCS.COLL.MT.NUM,NO.OF.EMPLOYEES)
T24ExtractLocalRef Used to parse LocalRef columns into a new separate table.
LocalRef columns for the customer table will be parsed into
the BNK_CUSTOMER_LocalRef table as an example. Values
can be Yes, No or NULL.
(i.e. this column value is set to Yes if we want to parse
LocalRef columns in a table and to No/NULL if we do not
want to parse them)


T24SkipColumnsInExtractLocalRef Used to skip parsing of specific LocalRef columns, so they
are not parsed into the new local ref table even when
T24ExtractLocalRef is set to Yes.
Syntax: column1,column2,…
(e.g. CCS.COLL.MT.NUM,NO.OF.EMPLOYEES)
T24ExtractSubvalueFromLocalRef Used to parse LocalRef columns that contain Sub-Values. If
LocalRef columns have sub-values in addition to multi-
values then setting this to yes will parse those out into an
additional table. Values can be Yes, No or NULL.
(i.e. this column value is set to Yes if we want to parse Sub-
values in LocalRef columns for the current table and to
No/NULL if we do not want to parse them)
T24MultivalueAssociation In T24/Core Banking applications, there are multi-valued
sets of fields that are associated. This column is used to
indicate whether these fields need to be parsed or not.
Valid entries are ‘Yes’ or null. If the value of this field is set
to ‘Yes’ and the Insight.s_PopulateMVTable.sql script
available under the InsightImport database is run, then ETL
will copy all the MV columns from the associated SourceSchema
table to the T24MultiValueSubValueTables table. The user
will then review and enable the required MV columns by
setting Active=1 in that table.
T24SubvalueAssociation Used to indicate whether sub-valued values within multi-
valued fields should be parsed or not. Valid entries are ‘Yes’
or null. If the value of this field is set to ‘Yes’ and the
Insight.s_PopulateMVTable.sql script available under the
InsightImport database is run, then ETL will copy all the
MVSV columns from the associated SourceSchema table to
the T24MultiValueSubValueTables table. The user will then
review and enable the required MVSV columns by setting
Active=1 in that table.
DQAlwaysOn This parameter drives the behaviour of Data Quality Import
in Analytics ETL. Acceptable values are 1 (On) or 0/NULL
(Off). For base tables, once the ‘DQAlwaysOn’ flag is
switched on, ETL Import will not attempt to BULK-INSERT
tables the first time it runs; instead, it will directly apply DQ
Import. With this flag switched on, corrupted data can be
identified and revised as early as the very first ETL Import;
even forbidden nulls in a primary-key-to-be column can be
corrected by DQ.
Starting from the second ETL run, sub tables, including all
Local-ref (LR), Local-ref-sub-value (LRSV), Multi-value
(MV) and Multi-value-sub-value (MVSV) tables, are always
guided by DQ, which works seamlessly with the in-memory,
multithreaded sub-table parser.
Essential system tables such as STANDARD_SELECTION,
LOCAL_REF_TABLE and LOCAL_TABLE are always created
with a fixed table structure. Nevertheless, with ‘DQAlwaysOn’
switched on, their column definitions in ImportSchema still
make them DQI-checked at all times. The only exempted
table is HASH_TOTAL; it is so critical that it is always
created with a pre-defined table structure and never takes
the DQ or DP approach. As a consequence, any corrupted
data in the CSV splits of HASH_TOTAL causes an
immediate ETL termination.
T24ParseConsolKey Stores information on how to parse consolidation keys for
the GL. If set to Yes, consolidation keys are parsed to boost
ETL performance (this column value is only set to Yes for
RE_CONSOL_SPEC_ENTRY and CRF-related tables)
Configuration This column is used to make version control in this table
more consistent and easier to manage. Configuration
defines what the source for the current row is in the table
(provided out-of-the-box by Temenos, added later by the
client as a result of local development etc.) – available
values are:
- ModelBank: this entry has been added to satisfy
Temenos core banking mapping and/or business
rules
- Local: the entry is used during the implementation
to update or enhance Framework or ModelBank
functionality
- Framework: this entry has been added to the TFS
Framework solution and it is core banking agnostic
- PBModelBank: Used for Private Banking record
definitions
- CampaignAnalytics: Used for Campaign
Analytics solutions
- Predictive: Used for Predictive Analytics solution
when deployed

DFTableName Data Filter Table Name (for future use).


DataFilterCode Data Filter Code (for future use).
OnlineProcess This is a flag to set whether the table considered is
populated through the batch method (i.e. with data
extracted from Temenos Core Banking during COB and
stored into CSV files) or online (i.e. in a near real-time
manner). Acceptable values are 1 (which means Online), 0
(which means Batch) or NULL (also means Batch).
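The Enabled_/Configuration workflow described above can be sketched as follows; only a subset of columns is shown and the values are illustrative.

-- Disable the out-of-the-box definition for ACCOUNT ...
UPDATE InsightImport..SourceSchema
SET Enabled_ = 0
WHERE TableName = 'ACCOUNT' AND Configuration = 'ModelBank';

-- ... and register the locally modified one
INSERT INTO InsightImport..SourceSchema
    (TableName, SchemaName, Enabled_, T24SelectionName,
     T24RebuildFromSelectionIfMissing, T24ExtractLocalRef, Configuration)
VALUES
    ('ACCOUNT', 'BS', 1, 'ACCOUNT', 'Yes', 'Yes', 'Local');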


Insight.SourceBusinessEntities
This table is used to create a list of Core banking companies that CSV files must be imported for and should
be configured before the first Analytics ETL is executed.

Column Name Description


EntityMnemonic E.g. BNK.
EntityType Master or Lead, Master should be first.
Configuration This column is used to make version control in this table
more consistent and easier to manage. Configuration
defines what the source for the current row is in the table
(provided out-of-the-box by Temenos, added later by the
client as a result of local development etc.) – available
values are:
- ModelBank: this entry has been added to satisfy
Temenos core banking mapping and/or business
rules
- Local: the entry is used during the implementation
to update or enhance Framework or ModelBank
functionality
- Framework: this entry has been added to the TFS
Framework solution and it is core banking agnostic
- PBModelBank: Used for Private Banking record
definitions
- CampaignAnalytics: Used for Campaign
Analytics solutions
- Predictive: Used for Predictive Analytics solution
when deployed

Insight.ColumnOverride
This table is used to rename or change the data type of columns and should be configured before the first
Analytics ETL is executed (a worked example follows the table below).

Column Name Description


TableName E.g. AA_ARR_ACCOUNT
ColumnName E.g. [@ID]
DataType The new data type e.g. Nvarchar(20)
InsightColumnName The new column name
AllowNulls Defines if nulls should be allowed; this is necessary for
index creation. Acceptable values are 1 (i.e. Yes), 0 and
NULL (which both mean No)
Configuration Configuration defines the information source for the current
row – available values are:


- ModelBank: this entry has been added to satisfy
Temenos core banking mapping and/or business
rules
- Local: the entry is used during the implementation
to update or enhance Framework or ModelBank
functionality
- Framework: this entry has been added to the TFS
Framework solution and it is core banking agnostic
- PBModelBank: Used for Private Banking record
definitions
- CampaignAnalytics: Used for Campaign
Analytics solutions
- Predictive: Used for Predictive Analytics solution
when deployed

Insight.SourceColumns
This table is used to store the list of columns in each table imported in InsightImport.

Column Name Description


EntityId This foreign key is the Id of the Insight.Entities entry
associated with this row.
ColumnName Column Name as per defined in the CSV file
ColumnOrder Order of the column as per CSV file
TableId Not currently in use.
ColumnId Not currently in use.

Insight.Attributes
This table stores the data dictionary compiled from Insight.SourceColumns, T24StandardSelection,
SourceSchema, Insight.DataProfiles and Insight.ColumnOverride.

Important Note: Insight.Attributes should not be updated directly, but only by
Analytics ETL based on the content of the above tables. Insight.Attributes’ content is
regenerated based on user input, so direct changes will not be preserved.

Column Name Description


AttributeId Identity field for the Insight.Attributes entry
EntityId This foreign key is the Id of the Insight.Entities entry
associated with this row.
Name Original column name from CSV file Eg. BRANCH_CO_MNE
InsightColumnName From Insight.ColumnOverride, will rename an existing
column.
ActualColumnOrder ColumnOrder from Insight.SourceColumns, the order of
columns in the actual CSV’s.

SourceSystemColumnOrder The order of columns in the Source System data dictionary
(Standard Selection). Note that SourceSystemColumnOrder
can have negative values for those columns added by Data
ExStore which were not present in the original Core
banking table (e.g. MIS_DATE).
ColumnType The Column type in the Source System. Acceptable values
are: Regular (for regular single-value columns), MV (for
Multi-value columns), MVSV (for columns which are sub-
values of multi-value fields), LocalRef (for Local Reference
columns) and LRSV (for columns which are Sub values of
local references).
SourceSystemDataType The DataType in the Source System Data Dictionary
(Standard_Selection for Temenos Core banking). E.g.
[nvarchar](Max)
ActualDataType The Data Type determined by data profiling and stored in
Insight.DataProfiles. E.g. [Nvarchar](50)
InsightDataType The data type from Insight.ColumnOverride eg.
Nvarchar(20)
SingleOrMultivalue The value of this column is calculated by ETL if the field
T24MultiValueAssociation is set to ‘Yes’ in the
SourceSchema definition for the table. SingleOrMultivalue
determines if the column defined in this row is single or
multi-value i.e. value S or value M.
ColumnExistsInSourceTable Defines if the currently described column exists in the CSV
file. Acceptable values are 1 (i.e. Yes), 0 or NULL (i.e. NO)
and they are defined based on the content of the
Insight.SourceColumn table
ActualColumnLength The column length determined by data profiling and stored
in Insight.DataProfiles, if different from what is defined
in the Source System (see next column)
SourceSystemColumnLength The Column Length in the SourceSystem Data Dictionary
(Standard Selection table in Core Banking).
InsightColumnLength The column length from Insight.ColumnOverride
RebuildFromSelectionifMissing Column taken from SourceSchema (if set to Yes, an empty
table is built in InsightImport based on its Standard
Selection definition when no CSV file is provided for a
certain Core banking table).
ParsingAlias Column Alias internally used by the parsing logic for multi-
value tables. The value of this column should not be
manually edited
ColumnLog Defines if the column was updated (acceptable values:
Insert, Column Changed)
ColumnSource Defines which source table was used to insert or update
the column updated e.g. Standard Selection,
SourceColumns, KeyColumns, ColumnOverride etc.
ColumnID The ID from system column if any
TableID The ID from system tables if any


Active Defines if the column is active and should be considered for
Analytics ETL. Acceptable values are 1 (i.e. Yes), 0 and
NULL (which both mean No)
Added When the column was added
Modified When the column was modified
AllowNulls Defines if the column can contain NULL. Acceptable values
are 1 (i.e. Yes), 0 and NULL (which both mean No)
PrimaryIndexColumn From SourceSchema, defines if the column is a primary
index. Acceptable values are 1 (i.e. Yes), 0 and NULL
(which both mean No)
RowHash Hashed value of the row

Insight.Entities
This table contains a list of tables to be processed and it is populated by the
s_InsightImportTableList_Create stored procedure. (Note that between Insight.Entities and
Insight.Attributes there is a foreign key relationship defined, but no cascade-update is in effect;
updates made in Entities must be manually synchronized to Attributes with transaction protection.)

Important Note: Insight.Entities should not be updated directly, but only by
Analytics ETL. Its content is regenerated based on user input, so direct changes will
not be preserved.

Column Name Description


EntityID Identity field
Name The table name
ParentEntityId This column, in connection with ChildLevel and TableType,
defines any hierarchical relationship between tables in
Insight.Entities. If a table is dependent on another table,
the ParentEntityId column will store the EntityId of the
parent table. For example, if the currently defined table
stores the content of multi-value fields of a parent table
with EntityId = 42, the ParentEntityId column will be set to
42.
If the current table is not dependent on any parent table,
the value of this column will be null.
ChildLevel Defines the hierarchical level of dependency of the current
table and can be null, 1 or 2.
E.g. the parent table AA_ARR_PAYMENT_SCHEDULE has a
child table called
AA_ARR_PAYMENT_SCHEDULE_Payment_Freq, and the
latter stores a set of associated multi-values from the
former. AA_ARR_PAYMENT_SCHEDULE will have
ChildLevel set to NULL (and also ParentEntityId set to null)
while AA_ARR_PAYMENT_SCHEDULE_Payment_Freq will
have ChildLevel set to 1 (and ParentEntityId set to the
EntityId of AA_ARR_PAYMENT_SCHEDULE). Now, some of
the multi-values in
AA_ARR_PAYMENT_SCHEDULE_Payment_Freq have sub-
values and these will be parsed into the
AA_ARR_PAYMENT_SCHEDULE_Payment_Freq_Property
table.
AA_ARR_PAYMENT_SCHEDULE_Payment_Freq_Property
will have ChildLevel set to 2 (as this is the child table of
another child table) and its ParentEntityId value will be the
EntityId of AA_ARR_PAYMENT_SCHEDULE_Payment_Freq.
TableType The table type in the Source System. Acceptable values are:
Regular (for regular tables), MV (tables generated for Multi-
value parsing), MVSV (tables generated for sub-values
parsing), LocalRef (tables generated for Local Reference
parsing) and LRSV (for Sub values of local ref tables).
DateProcessedStart When processing started
DateProcessed When processing finished
BatchNumber Batch Number used for processing the current table. This
column has been introduced as part of the logging process
RecordCount Number of records in the table
DataProfileDone Defines if Data Profiling was executed on the table.
Acceptable values are 1 (i.e. Yes), 0 and null (which both
mean No).
Active Defines if the table is active. Acceptable values are 1 (i.e.
Yes), 0 and null (which both mean No).
Added When the record was added
Modified When the record was modified
OnlineProcess Internal system column that defines if the record is subject
to online processing. Possible values are 1 (i.e. Yes), 0 (i.e.
No) and NULL (i.e. No)

Insight.KeyColumns
This table is the source of any columns which are not in the Source System data dictionary (i.e. the Core
banking Standard Selection table, copied to Insight.T24StandardSelection in InsightImport) and which are
used to create key columns on a global (all tables) basis. This table should be configured before the first
Analytics ETL begins.

From R19, a new record definition for the TIME_STAMP column has been added for all table types to
facilitate the DW Online integration.

Column Name Description


TableType Regular for CSV’s. MV for multi-value tables, MVSV for sub
values of Multi-value tables, LocalRef for local ref tables,
LRSV for Sub values of local ref tables.
ColumnName Eg. Lead_Co_Mne


DataType Eg. Nvarchar(10)


ColumnOrder This order needs to result in the columns being ordered
before the first column in Standard Selection. Assume the
first columns in the CSV are Lead_Co_Mne,
Branch_Co_Mne, MIS_Date and @ID, but only @ID is in
T24StandardSelection; then Lead_Co_Mne,
Branch_Co_Mne and MIS_Date are added to
Insight.KeyColumns with order -3, -2 and -1 (see the
sketch after this table).
Parsing Alias Column alias to be passed to the CLR parsing functions.
Child Level The number of levels the table is from the parent table
(similar to Child level column in Insight.Entities).
Primary Index Column Defines if a column is a primary index. Acceptable values
are 1 (which means that the column is a primary index), 0
and null (which both mean that the column is not a primary
index). Please note that primary index columns should not
allow null values.
Configuration Configuration defines the information source for the current
row – available values are:
- ModelBank: this entry has been added to satisfy
Temenos core banking mapping and/or business
rules
- Local: the entry is used during the implementation
to update or enhance Framework or ModelBank
functionality
- Framework: this entry has been added to the TFS
Framework solution and it is core banking agnostic
- PBModelBank: Used for Private Banking record
definitions
- CampaignAnalytics: Used for Campaign
Analytics solutions
- Predictive: Used for Predictive Analytics solution
when deployed

Insight.DataProfiles
This table stores the results of the data profiling process and it is updated by Insight.s_DataProfile_Create
only.

Column Name Description


EntityId This foreign key defines the id of the table being data
profiled and it is taken from the Insight.Entities table
ColumnName Name of the column on which data profiling was executed
e.g. @ID
ColumnMinValue The minimum value of the column, when sorted as text.


ColumnMaxValue The maximum value of the column, when sorted as text.


ColumnMaxLenValue The string in the column with the maximum length. Eg.
ABCDEFG
ColumnLength The length of the column from the source system data
dictionary (T24StandardSelection).
ColumnIsNum Defines if the column is a number. Acceptable values are 1
(i.e. Yes), 0 and null (which both mean No)
ColumnIsBigInt Defines if the column is a BigInt. Acceptable values are 1 (i.e.
Yes), 0 and null (which both mean No)
ColumnIsDecimal Defines if the column is a decimal. Acceptable values are 1
(i.e. Yes), 0 and null (which both mean No)
ColumnIsDate Defines if the column is a date. Acceptable values are 1 (i.e.
Yes), 0 and null (which both mean No)
ColumnIsNull Defines if the column is null. Acceptable values are 1 (i.e.
Yes), 0 and null (which both mean No)
TestValue Column value string which was used for testing the data
type
TheoreticalDataType Theoretical data type, as defined in the source
system e.g. [Date]
CalculatedDataTypeFull The full data type, e.g. [DateTime2], created by the
s_InsightActualDataTypes_Update stored procedure in
Analytics
ActualColumnLength The calculated actual length of the column
RecordCount Total records counted in the table identified by the EntityId
Added When the record was added.

Insight.ImportSchema
The new InsightImport.Insight.ImportSchema table plays an important role in Analytics ETL, Process
Data ExStore and the DQI process. In fact, Insight.ImportSchema is the only ‘blueprint’ table used to create
table structures during Data Quality Import. The column schema information not only ultimately
determines how the table structure is created, but is also used as the definitive quality standard for DQI to
determine the final data type for a field value in question. As long as a table is listed in ImportSchema, it
is subject to DQI checks during Analytics ETL/Process Data ExStore. Any raw value not conforming to the
specified data type, or violating the nullability condition, is subject to DQI correction. Except for the three
system columns (i.e. ETL_DQ_RevisionCount, ETL_DQ_ColumnsRevised and ETL_DQ_ErrorMessage), all
column names in an InsightImport data table must be registered in ImportSchema; any unknown column
not registered in ImportSchema is ignored during DQ Import. Any base table not listed in ImportSchema
undergoes Data Profiling Import (DPI) instead of the DQI process.

For this reason, no row-set is shipped with ImportSchema by default, i.e. this table is initially
empty. The initial run of Analytics ETL then operates DPI to create the first set of base table structures.
These initial table structures are automatically collected into ImportSchema as the draft import
table structure. Such Data Profiler-based import table structures are only meant for the initial data surveyed
and may not remain suitable for future imports. Therefore, a dedicated resource should
thoroughly review the table structures concluded from the DPI results and make the necessary changes to
the draft column types according to the true meaning of each column’s business logic. Usually, after a couple
of rounds of revision, the table structure becomes stable enough for long-term ETL operation. The column
schema information within ImportSchema is the definitive quality standard used by DQI to create the base
and sub tables, and to import data into them.

Important Note: new entries should not be manually entered in this table, but
existing column data profile definitions can be deleted to trigger new data profiling
for a table (e.g. if new locally developed columns have been added to the table).
Comparing the combined roles of Insight.Entities and Insight.Attributes with ImportSchema’s, the
difference is that the former are only used in the DPI operations that look after data being surveyed –
where irregular data may cause unexpected fluctuations. The latter, however, allows human intelligence to
decide whether to accept or override the DPI results – this is more suitable for long-term ETL operation.

If related CSV files are absent during the initial Analytics ETL/Process Data ExStore execution,
some empty base tables may still be created as conceptual structures based on the information given in
StandardSelection and SourceSchema. In this empty-table scenario, the column schema information is not
collected into Insight.ImportSchema and the table will not be processed by DQI until the next import is done
with a complete set of CSV splits.

In certain cases, DQI may also get involved with DPI during the initial ETL Import for base tables: in the
event that the DPI bulk insert fails twice, DQI will take over and handle the situation.

Column Name Description


SchemaId Record Id (identity column). Populated automatically.
EntityId This foreign key defines the id of the table that was data
profiled and it is taken from the Insight.Entities table.
TableSchema Schema of the table that was data profiled e.g. dbo.
TableName Name of the table that was data profiled e.g. ACCOUNT.
OrdinalPosition Position of the column in its data profiled table e.g. 1.
ColumnName Name of the data profiled column e.g. CUSTOMER.
DataType General type assigned to the column during the data
profiling process e.g. [nvarchar]
FullDataType The Full data type, eg. [nvarchar(max)], created by
s_InsightActualDataTypes_Update stored procedure in
Analytics
CharMaxLenght The maximum length of the string e.g. 500.
ByteMaxLenght The maximum size per object e.g. 2147483591.
NumPrecision Number of digits in a figure (if applicable) e.g. 100.
NumScale Number of digits to the right of the decimal point in a
number (if applicable) e.g. 2.
IsNullable This is a Boolean column (i.e. contains value 1 or 0) that
defines if a column can have null values or not. It should
be noted that primary key columns are now always set as
non-null in ImportSchema (a set of General DQI rules
related to primary keys are automatically merged into the
InsightETL table called RuleDataQuality right before the
actual import starts).

Info Any additional descriptive text about the column.



Insight.CsvColumns
Each and every CSV split is surveyed in a multithreaded CLR procedure called s_CollectCsvHeaderColumns.
The results of this split process are collected in the Insight.CsvColumns table to create individual views for
each CSV split during Data Quality import. This table contains one row per column per CSV file instance.

Column Name Description


CsvColumnID Record Id (identity column). Populated automatically.
EntityId This foreign key defines the id of the table that was data
profiled and it is taken from the Insight.Entities table.
TableName Name of the table that was subject to CSV split e.g.
ACCOUNT.
CsvFile Name of the CSV file associated with the table that was
subject to CSV split e.g.
BNK_AA_ACCOUNT_DETAILS_0040
CsvView Name of the view created for the individual CSV file split
e.g. v_BNK_AA_ACCOUNT_DETAILS_0040
CsvPath Path to the imported CSV file e.g.
E:\InsightImport\BNK_AA_ACCOUNT_DETAILS_0040.csv
CsvRank The same CSV file can be split into several instances whose
names are suffixed with the Id of the thread used to process
them in Temenos Core Banking. E.g. if the BNK_ACCOUNT
CSV files were processed by threads 39, 40 and 41, there
will be three resulting CSV files called
BNK_ACCOUNT_0039, BNK_ACCOUNT_0040 and
BNK_ACCOUNT_0041. The CsvRank column in this table
will store the ranking or processing order of these CSV files,
e.g. the CsvRank for BNK_ACCOUNT_0039 will be ‘1’, for
BNK_ACCOUNT_0040 it will be ‘2’, for
BNK_ACCOUNT_0041 it will be ‘3’ etc.
ColumnName Target name of the column in the table that was subject to
CSV split e.g. @ID
RawColumnName Original name of the column in the CSV file e.g. @ID
ColumnOrder Order of a column within a table E.g. 1
Added Date and time of the last processing e.g. 2018-06-22
14:32:05.080

Insight.Numbers
This table holds consecutive integers that are used during ETL import. In R18, numbers creation is
done through a recursive CTE instead of a cross-join on the system tables, as was done in earlier
releases (a minimal sketch follows the table below).

Column Name Description


Number Integer used during ETL import e.g. -1, 0, 1 etc.
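A minimal sketch of generating consecutive integers with a recursive CTE, as described above (the range is illustrative):

WITH Numbers AS (
    SELECT -1 AS Number                  -- seed value
    UNION ALL
    SELECT Number + 1
    FROM Numbers
    WHERE Number < 9999                  -- illustrative upper bound
)
SELECT Number
FROM Numbers
OPTION (MAXRECURSION 0);                 -- lift the default 100-level recursion cap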


Insight.T24LocalRefs
This table presents a full list of local reference columns, including information about the source system table
they originate from, their label and target table in Analytics, and the characteristics of these local reference
fields, e.g. data type, available values etc. This table combines information from Local_Table and
Local_Ref_Table.

Important Note: Insight.T24LocalRefs should not be updated directly, but only by
Analytics ETL based on the content of the above tables. This table’s content is
regenerated based on user input, so direct changes will not be preserved.

Column Name Description


TableName Original table name in Temenos Core Banking E.g.
AA.PRD.DES.SETTLEMENT
ColumnName Original name of the local reference field in Temenos Core
Banking E.g. CHKG.ACCOUNT
InsightTableName The new table name in InsightImport E.g.
AA_PRD_DES_SETTLEMENT
InsightColumnName The new column name in InsightImport E.g.
CHKG_ACCOUNT
ColumnDesc Text describing the use of the column
ColumnOrder Order of a column within a table E.g. 1
VettingTable If a local reference field is populated through a
dropdownlist of multiple options, this column will contain
the list of available values separated by a tag. E.g. 290,
240, 60, 901. This column will be empty if the local
reference is a free text field or if the ApplicationVetting
column is populated (see below).
VettingTableValues This column contains a description for each of the values
present in VettingTable, separated by a tag. E.g. ALBANIE,
ALLEMAGNE DE L'EST, ALLEMAGNE DE L'OUEST,
ANDORRE.
ApplicationVetting If a local reference field is populated through a
dropdownlist linked to another table, this column will
contain the name of the linked Temenos Core Banking
table. E.g. CUSTOMER
ApplicationEnrichment When ApplicationVetting is populated and hence the values
of the local reference fields are selected from an associated
Temenos Core Banking Table, the content of this column
defines the enrichment (description) to be displayed when
the field is populated. This descriptive information is
provided using a field from the linked table specified in
ApplicationVetting. ApplicationEnrichment will store the
order of the enrichment field in the source table E.g. 2 (that
means CUSTOMER MNEMONIC if the selected
ApplicationVetting is the CUSTOMER table).
SubAssociationCode This code defines if the local reference is configured to
store sub-values or if it is associated with other local
reference fields. The acceptable entries for this column are
the following:
 XX. denotes that this is a standalone local reference
that can store sub values.
 XX< denotes the start of an associated local
references set.
 XX- denotes that this is part of an associated local
references set.
 XX> denotes the end of the associated local
references set.
 Null denotes that this local reference is a standard
standalone field.

Char_type This column stores the Temenos Core Banking character
type (i.e. data type) assigned to this local reference, e.g. A
for alphabetic character, CUS for customer number, D for
date etc. Please refer to the Temenos Core Banking
documentation for more information about available
character types.

Insight.T24LocalRefsWithSS
This table presents a full list of local reference columns as described in the Temenos Core Banking STANDARD.SELECTION table (imported into the Analytics Platform through the associated STANDARD_SELECTION CSV file).

Important Note: Insight.T24LocalRefsWithSS should not be updated directly, but only by Analytics ETL based on the content of the above tables. This table’s content is regenerated based on user input so direct changes will not be preserved.

Column Name Description


ColumnOrder Order of a column within a table E.g. 1
TableName Table name in InsightImport E.g. FUNDS_TRANSFER
SchemaName Schema name of the table in InsightImport E.g. dbo
ColumnName The original column name in Temenos Core Banking E.g.
AA.REFERENCE
SubAssociationCode This code defines if the local reference is configured to store sub values or if it is associated with other local reference fields. The acceptable entries for this column are the following:
 XX. denotes that this is a standalone local reference that can store sub values.
 XX< denotes the start of an associated local references set.
 XX- denotes that this is part of an associated local references set.
 XX> denotes the end of the associated local references set.
 Null denotes that this local reference is a standard standalone field.
DataType Data type assigned to the column by the Data Parser
Import process e.g. [nvarchar](50)
BaseDataType Basic data type assigned to the column by the Data Parser
Import process e.g. [nvarchar]
InsightColumnName Column name in Analytics as per SQL naming conventions
E.g. AA_REFERENCE
SelectionName Original application name in Temenos Core Banking as specified in STANDARD.SELECTION E.g. FUNDS.TRANSFER
ValidationApplication This column stores the validation type assigned to this local reference in Temenos Core Banking E.g. IN2A for alphabetic character, IN2CUS for customer number, IN2D for date etc.
Please refer to Temenos Core Banking documentation for
more information about available validation types.

Insight.T24StandardSelection
This table displays the content of the Standard Selection table, imported from the Temenos Core banking
system and containing source data dictionary information. Each row corresponds to the definition of one
Temenos Core Banking table column.

Important Note: Insight.T24StandardSelection should not be updated directly, but only by Analytics ETL based on the content of the above tables. This table’s content is regenerated based on user input so direct changes will not be preserved.

Column Name Description


TableName Table name in InsightImport E.g. AA_ACCOUNT_DETAILS
InsightColumnName The new column name in InsightImport E.g.
ACT_PAY_DATE
DataType Data type assigned to the column by the Data Parser
Import process e.g. [nvarchar](50)
BaseDataType Basic data type assigned to the column by the Data Parser
Import process e.g. [nvarchar]
SelectionName Original table name in Temenos Core Banking E.g.
AA.ACCOUNT.DETAILS
ColumnType Defines the type of columns. Valid entries are ‘system’ for
system column and ‘user’ for local reference fields.
ColumnName Original name of the local reference field in Temenos Core
Banking E.g. ACT.PAY.DATE

T24ColumnType Defines the Temenos Core Banking type associated with this column that identifies how the column is populated e.g. D stands for Data (the column is a standard data column, normally populated manually), S stands for Subroutine (the column is populated by a subroutine), J stands for Joined Table (the column value is extracted from a joined table) etc. Please refer to Temenos Core Banking documentation for more information about the available T24 Column Types.
ColumnSource Defines the source of the column in the table. When the
T24ColumnType is set to D, this field will simply contain the
order of a specific column in the table e.g. 1. However the
content of this column changes for different
T24ColumnTypes e.g. for a type J, the ColumnSource will
indicate how the current column is linked to a column in
the joined table e.g. the value
CUSTOMER.NO>CUSTOMER>RESIDENCE means that the
current column stores the value of the RESIDENCE field in
the joined CUSTOMER table and uses the content of the
CUSTOMER.NO field in the current table to filter it. Please
refer to Temenos Core Banking documentation for more
information about the available T24 Column Types and
their associated ColumnSources.
ColumnSourceOrder Order of a specific column in the table to which it belongs e.g. 2. If T24ColumnType is not D, this column will be empty.
ValidationApplication This column stores the validation type assigned to this local reference in Temenos Core Banking E.g. IN2A for alphabetic
character, IN2CUS for customer number, IN2D for date etc.
Please refer to Temenos Core Banking documentation for
more information about available validation types.
MaxColumnLenght Maximum number of characters in the column E.g. 16
SingleOrMultivalue Defines if the column is single-value or multi-value.
Acceptable values are S or M.
VettingApplication If the column defined is linked to an associated table from
which it draws its content (e.g. the ACCOUNT column in the
CUSTOMER table selects its content from the list of
available Ids in the ACCOUNT table), this column will store
the name of the application associated with the column e.g.
ACCOUNT.
CNVType Conversion Type that defines the format of the column’s
value E.g. ACCOUNT.
ColumnOrder Order of a column within the InsightImport table E.g. 1.
RowNumber Number of rows associated with the column. Default value
is 1.
InsightColumnType Type of Column in the Analytics Platform. Acceptable values
are Regular or Local_Ref.
TableSuffix Suffix associated with the Temenos Core Banking table if it does not store live records e.g. HIS or NAU. Typically history or unauthorized records are not imported in Analytics, hence this column will be set to NULL or None.

Insight.T24SubAttributeSourceData
This table displays the list of columns resulting from the parsing of multi values, multi value sub values,
local references and local reference sub values. These attributes will be stored in the so-called ‘sub’ tables, whose details will also be stored in this table.

Important Note: Insight.T24SubAttributeSourceData should not be updated directly, but only by Analytics ETL based on the content of the above tables. This table’s content is regenerated based on user input so direct changes will not be preserved.

Column Name Description


T24SubAttributeSourceDataId Record Id (identity column). Populated automatically.
EntityId Foreign key identifying the Insight.Entity entry associated
with the current definition i.e. the id of the sub table
associated with the attribute here defined. E.g. 77
TableName Table name in InsightImport E.g.
AA_ACCOUNT_DETAILS_Bill_Pay_Date
ParentTableName Name of the base table associated with the sub table
storing the attribute in InsightImport E.g.
AA_ACCOUNT_DETAILS
Name The ‘sub’ column name E.g. LEAD_CO_MNE
InsightColumnName The column name in InsightImport if different from the CSV
column name. The default value is NULL.
ActualColumnOrder Order of the column in the target sub table created by
Analytics ETL. E.g. 1
ColumnType Defines the type of column, specifically whether they are
multi value, multi value sub values, local references or local
reference sub values. The acceptable values are MV, MVSV,
Local Ref and LRSV.
SourceSystemDataType Data type assigned to the column e.g. [nvarchar](150)
ActualDataType Currently not in use.
InsightDataType Currently not in use.
SingleOrMultivalue Defines if the column is single or multi value. Two options
are acceptable M or S.
ColumnExistsInSourceTable Currently not in use.
ActualColumnLenght Currently not in use.
SourceSystemColumnLenght Maximum number of characters of the column e.g. 40
InsightColumnLenght Currently not in use.
RebuildFromSelectionifMissing Currently not in use.
ColumnLog Currently not in use.

ColumnSource Defines the column source. Acceptable values are KeyColumn for Primary Keys, Standard Selection for columns whose definition is extracted from STANDARD.SELECTION (including local reference fields), and MV and MVSV for fields whose definitions are created through multi value and multi value sub value parsing, respectively.
Active Defines if the column is active or not. Acceptable values are
1 (Active), 0 or NULL (Inactive).
AllowNulls Defines if the column allows null values. Acceptable values
are 1 (Yes), 0 or NULL (No).
PrimaryIndexColumns Defines if the column is a primary index column. Acceptable
values are 1 (Yes), 0 or NULL (No).
ChildLevel Defines the sub level of the table that hosts the column. E.g. when ChildLevel is set to 1, the table results from the parsing of one or more multi values or local references. When ChildLevel is set to 2, the sub table is the result of multi value sub value or local reference sub value parsing.
ParsingAlias Alias label used by the parsing stored procedure.
BaseTableName Name of the base table from which the sub table is derived.
E.g. AA_ACCOUNT_DETAILS

Insight.T24MultiValueSubValueTables
This table stores the list of multi-value (MV) and multi-value sub-value (MVSV) tables to be created in
InsightImport during the MV and MVSV parsing phase of ETL. The content of this table is the combination
of the entries listed in the v_T24MultiValues and v_T24MultiValueSubValues materialized views (that will be discussed
later in this chapter).

This table is populated automatically the first time ETL runs, based on the content of the SourceSchema table. If the T24MultivalueAssociation flag is set to “Yes” for a table, all the child tables required for parsing all the multi-values or multi-value sets within the parent table will be listed in Insight.T24MultiValueSubValueTables. If the T24SubvalueAssociation flag is set to “Yes” for said table, all the child tables required for parsing all the multi-value sub-values or multi-value sub-value sets within the parent table will also be included here. The MV and MVSV sets will be determined programmatically by ETL, based on the content of the FIELD_NAME_TEMPLATE field in the STANDARD_SELECTION entry for the parent table. Once all the MV and MVSV parsing tables have been listed in Insight.T24MultiValueSubValueTables, the database administrator can choose to activate or deactivate the actual parsing process performed by ETL on a specific child table through the Active flag, or they can rename the child MV or MVSV table through the InsightTableName column, as discussed below.
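
For example, assuming the child table named above, a database administrator could disable parsing for one MV set, or redirect its output to a renamed table, with simple updates such as these (illustrative statements, not part of the product):

UPDATE InsightImport.Insight.T24MultiValueSubValueTables
SET Active = 0 -- skip parsing of this MV set on the next ETL run
WHERE TableName = 'AA_ACCOUNT_DETAILS_BILL_PAY_DATE';

UPDATE InsightImport.Insight.T24MultiValueSubValueTables
SET InsightTableName = 'AA_ACCOUNT_DETAILS_BILLS' -- hypothetical new name
WHERE TableName = 'AA_ACCOUNT_DETAILS_BILL_PAY_DATE';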

Column Name Description


TableName Name of the parsing table created in InsightImport. If the
table is used for multi-value parsing, the name syntax
consists of the parent table name followed by the name of
the first MV field in the set e.g.
AA_ACCOUNT_DETAILS_BILL_PAY_DATE (where
AA_ACCOUNT_DETAILS is the parent table name and
BILL_PAY_DATE is the name of the first field in a MV set).

If the table is used for multi-value sub-value parsing, instead, the name syntax consists of the parent table name followed by the name of the first MV field in the set, followed by the name of the first SV field in the set e.g. AA_ACCOUNT_DETAILS_BILL_PAY_DATE_BILL_ID (where BILL_ID is the name of the first field in the SV set).
ParentTableName Name of the parent table for the parsing table created in
InsightImport e.g. AA_ACCOUNT_DETAILS
ChildLevel Defines the parent-child relationship between the parsing
table specified in TableName and the parent table specified
in ParentTableName. If the parsing table is used for MV
parsing, then the ChildLevel value will be 1. If it is used for
MVSV parsing, instead, the ChildLevel value will be 2.
TableType Defines the type of parsing table. Acceptable values are MV
or MVSV.
ColumnList Defines the list of columns to be parsed in the MV or MVSV set. If multiple columns are present, they are separated by colons. E.g. BILL_PAY_DATE:BILL_ID:ACTIVITY_REF:BILL_DATE:DEFER_DATE:EXPIRY_DATE:BILL_TYPE:PAY_METHOD:BILL_STATUS:SET_STATUS:AGING_STATUS:NXT_AGE_DATE:CHASER_DATE
Active This flag defines if parsing is enabled or disabled for a specific MV or MVSV set. Acceptable values are 1 (i.e. Yes), 0 (i.e. No) or NULL (i.e. No). If parsing is activated for a MV/MVSV table, ETL will create it and populate it with parsed MV/MVSV data. Otherwise, parsing for the associated MV/MVSV set will be skipped.
InsightTableName If the database administrator decides to rename a child
table for parsing MV/MVSV sets, the new name should be
specified within this column. If this column is left empty,
the parsing table will bear the name specified in the
TableName column.
Configuration This column is used to make version control in this table more consistent and easier to manage. Configuration defines the source of the current row in the table (provided out-of-the-box by Temenos, added later by the client as a result of local development etc.). Available values are:
- ModelBank: this entry has been added to satisfy
Temenos core banking mapping and/or business
rules
- Local: the entry is used during the implementation
to update or enhance Framework or ModelBank
functionality
- Framework: this entry has been added to the TFS
Framework solution and it is core banking agnostic

- PBModelBank: Used for Private Banking record definitions
- CampaignAnalytics: Used for Campaign
Analytics solutions
- Predictive: Used for Predictive Analytics solution
when deployed

Online.OnlineOutput
The table InsightImport.Online.OnlineOutput is an internal control table used for micro batches. It is populated by DW Online insert events and holds a list of all records that need to be pushed through DQ, multi-value and sub-value parsing.
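
For instance, the micro-batch backlog can be inspected by counting pending records per source table (an illustrative query based on the columns described below):

SELECT SourceTableName, COUNT(*) AS PendingRecords
FROM InsightImport.Online.OnlineOutput
WHERE ProcessStatus = N'New'
GROUP BY SourceTableName;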

Column Name Description


OnlineOutputId Record Id (identity column). Populated automatically.
SourceTableName Table name
SourceTableRowId Record Id of the SourceTableName
LEAD_CO_MNE Lead Company Mnemonic
BRANCH_CO_MNE Branch Company Mnemonic
MIS_DATE Business Date of Online data
@ID Unique record Id in T24/Temenos Core Banking source
table
Event Name of event that created record (e.g. INSERT)
EventTime Date and time of event
ProcessStatus Status of record (e.g. New, Completed)
ProcessedTime Date and time of processing
ThreadNumber DW Online thread number
TimeStamp T24/Temenos Core Banking timestamp of source record

Views
Insight.v_ImportFileList
This view is based on the Hash Total table and potentially other file list tables.

Insight.v_T24FormLabels
This view uses the VERSION table from Temenos Core banking to extract meaningful headers and column
labels from locally designed screens.

Insight.v_T24LocalRefs
This view presents a full list of local reference columns including information about the source system table
they originate from, their label and target table in Analytics and the characteristics of these local reference
fields e.g. data type, available values etc.

This view combines information from Local_table and Local_ref_table.

Insight.v_T24LocalRefData
This view combines all local ref metadata in one place.

Insight.v_T24LocalRefSubvalueData
This view combines all local ref sub value metadata in one place.

Insight.v_T24MultivalueData
This view extracts its values from the materialized T24MultiValues and T24MultiValueSubValueTables tables. It will only show tables that have the Active flag set to 1 in T24MultiValueSubValueTables.[5]

Insight.v_T24MultivalueSubvalueData
This view extracts its values from the materialized T24MultiValueSubValues and T24MultiValueSubValueTables tables. It will only show tables that have the Active flag set to 1 in T24MultiValueSubValueTables.[6]

Insight.v_SubAttributeSourceData
This view combines the above Subtable metadata into one view.

Insight.v_Attributes
This view combines information from the Insight.Entities and Insight.Attributes tables.

Insight.v_T24StandardSelection
This view displays the content of the Standard Selection table, imported from the Core banking system and
containing source data dictionary information.

Insight.v_AttributeSourceData
This view combines all of the above for final Merge into the Insight.Attributes table.

Functions
fn_T24MultivalueData
This function takes values from the materialized T24MultiValues and T24MultiValueSubValueTables tables. It will only show tables that have the Active flag set to 1 in T24MultiValueSubValueTables.[7]

[5] Previously, this view was designed to take MV columns from T24MultiValueAssociation and split them using functions, then datatype them based on the value of the T24SubValueAssociation field in SourceSchema, which leads MVSV columns to have SS-specific datatypes instead of nvarchar(max).
[6] Previously, this view was designed to take MVSV columns from T24SubValueAssociation and split them using functions.
[7] In pre-R19 releases, this function was designed to take MV columns from T24MultiValueAssociation and parse them, then datatype them based on the T24SubValueAssociation field in SourceSchema, which leads MVSV columns to have SS-specific datatypes instead of nvarchar(max).

fn_T24MultiValueSubValueData
This function takes values from the materialized T24MultiValueSubValues and T24MultiValueSubValueTables tables. It will only show tables that have the Active flag set to 1 in T24MultiValueSubValueTables.[8]

fn_GetPendingSubTables
This function takes values from fn_T24MultivalueData and fn_T24MultiValueSubValueData for MV and MVSV tables.[9]

SQL Stored Procedures


The key stored procedure in InsightImport is InsightImport.Insight.s_Import_Control, which will be discussed in a dedicated section. This stored procedure manages the control of the process to Bulk Insert from CSV files to SQL. It uses a number of helper stored procedures to accomplish the different phases of loading, data profiling and parsing of data, also discussed below. A log of the import activity carried out by these stored procedures is found in the InsightETL.dbo.EventLog and InsightETL.dbo.EventLogDetails tables.

Insight.s_ImportBaseTables
Description
This is the controlling procedure used in the ETL Agent job to import the base tables from CSV files in parallel, with DQI (Data Quality Import) and/or DPI (Data Profiler Import). With the input parameter @TableName set to ‘ALL’, it will import every single base table enabled in the SourceSchema table. When assigning a specific table name to this parameter, instead, the procedure will only import the particular table mentioned. When importing a new base table for the first time, the procedure normally follows the DPI path to get a first impression of the column schemas to use, and this initial evaluation takes a slightly longer time to complete. During DPI, the Insight.ImportSchema table is populated with definitions for the existing columns and attributes. After the first Analytics ETL run, the quicker DQI path is taken. Unless the column schema information is removed from the Insight.ImportSchema table, the target base table’s structure remains immutable for day-to-day ETL, and as long as at least one non-system column of the table exists in Insight.ImportSchema, DQI will always be in effect.
We should note that Insight.ImportSchema includes definitions for system tables and regular non-system
tables. Non-system regular tables are always dropped and created from the column definition table
Insight.ImportSchema; however, re-creating the system tables can be optionally skipped (in debug mode).

For any known table already listed in Insight.ImportSchema, this procedure tries a straight BULK INSERT first during the DQI process, without Data Profiling. In case of failure due to corrupted or incompatible data, the Data Quality CLR procedure takes over to enforce Data Quality rules. For any new table that is not defined in Insight.ImportSchema, the procedure calls the classic dynamic import with Data Profiler to process it, then enlists the newly profiled table/columns in Insight.ImportSchema. To continue to parse and import multi-valued sub tables after finishing the base tables import, s_ImportSubTables should be used instead.

[8] Previously, this function was designed to take MVSV columns from T24SubValueAssociation and parse them.
[9] Previously, this function was designed to take MV columns from T24MultiValueAssociation and MVSV columns from T24SubValueAssociation and then parse them.

Steps
1. Loop through the companies in InsightImport.Insight.SourceBusinessEntities to create system
tables (master company is prioritized)
2. Creates system objects and updates Insight.CsvColumns
3. For regular tables for which a CSV is available, perform a multithreaded bulk insert with Data Quality (using an empty shell if a definition in Insight.ImportSchema is available)
Inputs
 CsvDir – File system path where the DW.Export CSVs are located. This should be all CSVs in the
same folder even if it is multi-company with different CSVs from different companies as they
should have different prefixes (example: BNK, CO1, CO2).

The syntax for the path is ‘DriveLetter:\Folder\Subfolder\’. Example ‘E:\DW.Export\’


 TableName – defines the table to be imported and for which a column list must be created. By
default, it is set to N'All'
 SystemTablesExist – defines whether system tables exist (1) or not (0). Normally, system tables
are to be updated at the beginning of each Analytics ETL. This parameter is only set to 1 (True) in
debugging mode.
 BatchNum – Batch number used to execute the stored procedure. This parameter introduces ETL
Batch control in the stored procedure and can be obtained from a parameter or indirectly from
dbo.Batch table through the function
 TotalThreads – allows to manually assign the number of threads used to execute the stored
procedure (accepts NULLs)
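
A typical invocation, following the parameter order listed above, might look like this (a sketch; the argument values are assumptions, not documented defaults):

Exec InsightImport.Insight.s_ImportBaseTables N'E:\DW.Export\', N'All', 0, NULL, NULL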

Insight.s_ImportDataReportErrors
Description
This procedure is in charge of updating InsightImport logs in case any error occurs while data is being
loaded into InsightImport. This stored procedure calculates a count of records in all tables, compares the record totals to the Hash Totals for all base tables and throws an error if there is a mismatch. Also, it adds a row to the Updatelog table for each bad table.

Steps
Calculates a count of records in all tables

Compares to Hash Totals for base tables

It throws an error if there is a mismatch

Writes to the dbo.Updatelog table for each bad table

Inputs
 TableName – defines the name of the table for which information has to be logged
 BatchNum – as described previously

Insight.s_ImportSubTables
Description
This is the controlling procedure used in the ETL Agent job to parse and import the ‘sub’ tables using either DQI (Data Quality Import) or DPI (Data Parser Import). ‘Sub’ tables are multi-value, multi-value sub-value, local reference and local reference sub-value tables (abbreviated as MV, MVSV, LR or LRSV tables, respectively).

For a new ‘sub’ table whose structure has never been registered in ImportSchema, the slightly lengthier DPI approach is taken to parse and import the data. Once the table’s definition is registered in Insight.ImportSchema, and after the user has reviewed and approved this new column schema information, subsequent ETL runs take the DQI approach to parse and import the data for this table.

Some major improvements were applied to sub table parsing in R18:

 Multithreading is performed at a fine granularity, that is, at the row level. By contrast, parallelism in R17 was performed at a relatively coarse level, using the logical unit of a combined table and procedure as the granularity. This improvement further boosts the efficiency of computing resource usage;
 DQI works hand-in-hand in memory with the sub table parsing threads. Without caching large sets of data anywhere, each and every multi-value is streamed from the raw stage to the parsed stage, then to the data quality-assured stage and at last bulk-copied to the target table. Since multithreaded in-memory parsing is usually faster than disk output, sub table parsing can now go almost as fast as the disk allows;
 The dedicated stored procedure has been simplified. In R18 there is only one SQL-CLR procedure (s_DQParseMulti-value) to call for all kinds of sub table parsing.

Steps
1. Loop through Insight.Entities to select ‘sub’ tables to be processed
2. Checks if the table considered has a record in Insight.ImportSchema.
3. If a record exists, the process skips data profiling for the table, prepares an empty shell for the
sub table based on the Insight.ImportSchema definition and then populates it with the results
of the parsing process.
4. If no Insight.ImportSchema definition exists for the table, data profiling is performed and MV, MVSV, LR and LRSV data is parsed and populated in a new ‘sub’ table. A new Insight.ImportSchema definition is created as a result of the data profiling process.
Inputs
 TableName – specifies a multi-value, sub-value, local reference or local reference sub value table name to parse and import. If this parameter is set to 'ALL' or NULL, it will parse and import all sub tables of the requested type; parsing can be restricted to a single table by specifying its name in this parameter.
 Table Type – defines the type of table to be processed. The value for this parameter can be set
to ‘MV’ for multi value, ‘MVSV’ for multi value sub value, ‘LR’ for Local Reference (‘Local Ref’ is also
a valid entry) or ‘LRSV’ for local reference sub value.
 BatchNum – Batch number used to execute the stored procedure. This parameter introduces ETL
Batch control in the stored procedure and can be obtained from a parameter or indirectly from
dbo.Batch table through the function

 TotalThreads – if set to NULL, 0 or below, it defines the maximum number of available CPUs. If
it contains a number between 1 and 200, it is used to manually assign the number of threads used
to execute the stored procedure, instead.
 OnlineProcess – Default is batch processing. When set to 1, only tables marked as OnlineProcess in Entities are processed.
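
For instance, to parse and import all multi-value sub tables in batch mode, the call might look like this (an illustrative sketch following the parameter order above):

Exec InsightImport.Insight.s_ImportSubTables N'All', N'MV', NULL, NULL, 0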

Insight.s_T24ConsolKeys_add
Description
This procedure parses Consolidation keys for GL-related tables like RE_CONSOL_SPEC_ENTRY and any CRF
file.

Steps
 This stored procedure compiles a list of columns based on the header row in each relevant CSV file. This is kept separate from Import Tables, even though there is a lot of duplicate code, to allow for future changes and flexibility
 Columns are stored in a temporary table called ListOfColumns
 Columns are then parsed and loaded from the Temp table into the final table i.e. Insight.SourceColumns.
 Logs the process outcome

Inputs
 TableName – defines the name of the table to be parsed (exact table name or ‘All’ are
acceptable entries)

 BatchNum – as described previously

Insight.s_BuildCOA
Description
This stored procedure parses the content of any imported CRF table and uses this data to build a Chart of Accounts table called COA_Structure that resides in InsightImport. Insight.s_BuildCOA has no controllable input parameters.
Steps
 This stored procedure selects the names of CRF reports from ExtractLists and counts them. If
General Ledger and Profit & Loss CRF files are separated, this process is iterated twice
 CRF report lines are extracted and parent lines are identified for each child line
 BS_GROUP (Balance Sheet group) is identified for each line
 The COA_Structure table is formed and populated

Insight.s_SystemObjects_Update
Description
This stored procedure updates the Hash_total table and the Standard Selection table. As previously mentioned, the Hash_total table lists all the CSV files to be imported, while Standard Selection lists the source system metadata for each column.

It takes a list of companies or entities as an input (table InsightImport.Insight.SourceBusinessEntities) and loops through the companies, importing all the Hash_Total files for each company.
Steps
1. Loop through the companies in InsightImport.Insight.SourceBusinessEntities
2. Bulk Insert the Hash_Totals file for each company into InsightImport..Hash_Totals
3. Create the Standard_Selection table by Bulk Inserting the CSVs for Standard_Selection into InsightImport..Standard_Selection.
Inputs
 Pathname – File system path where the DW.Export CSVs are located. This should be all CSVs in
the same folder even if it is multi-company with different CSVs from different companies as they
should have different prefixes (example: BNK, CO1, CO2).
The syntax for the path is ‘DriveLetter:\Folder\Subfolder\’. Example ‘E:\DW.Export\’
 ListofCompanies - A table valued Parameter which contains the Companies to be imported,
company values are obtained from InsightImport..SourceSystemEntities.
 DropTables - will result in tables being recreated rather than being re-used.
 BatchNum – Batch number used to execute the stored procedure. This parameter introduces ETL
Batch control in the stored procedure and can be obtained from a parameter or indirectly from
dbo.Batch table through the function

Insight.s_SourceColumns_Update
Description
This stored procedure populates the table Insight.SourceColumns with all the column names in each CSV to be imported.

Steps
 Bulk inserts the top row of each CSV into a Temp table.
 A string splitter function splits the row into many rows, one per field (see the sketch below).
 The end result is written into the Insight.SourceColumns table, with table name and column name.
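
The sketch below conveys the general idea using the built-in STRING_SPLIT function (SQL Server 2016+); the actual procedure uses its own splitter function, and the real header text comes from the bulk-inserted top row (the table and variable names here are hypothetical):

DECLARE @HeaderRow nvarchar(max) = N'@ID,MIS_DATE,LEAD_CO_MNE,CATEGORY';
SELECT N'ACCOUNT' AS TableName, value AS ColumnName
FROM STRING_SPLIT(@HeaderRow, N',');
-- each field of the header row becomes one row, ready to merge into Insight.SourceColumns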

Inputs
 PathName – as described previously.
 TableName – the table for which a column list must be created
 BatchNum – as described previously

Insight.s_Object_Create
Description
This procedure creates a table or a view given a list of columns.

Steps
Using dynamic SQL, construct and optionally execute the CREATE statement (see the sketch below).

Inputs
 ListOfColumn – table valued parameter of the list of columns and their data types (null value is acceptable)

 ExecuteCreate – should the statement be executed or just returned. Acceptable values are 1 =
Execute or 2 = Create
 ObjectType – type of object to be created. Acceptable values are 1 = Table or 2 = View

 BatchNum – as described previously


 QueryString – output of the stored procedure, only presented if ExecuteCreate is set to 2
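
The core of the procedure is dynamic SQL along these lines (a minimal sketch with hypothetical variable values, not the actual implementation):

DECLARE @TableName sysname = N'ACCOUNT';                                  -- hypothetical
DECLARE @ColumnList nvarchar(max) = N'[@ID] nvarchar(50), CATEGORY int';  -- hypothetical
DECLARE @ExecuteCreate int = 1;
DECLARE @QueryString nvarchar(max) =
    N'CREATE TABLE dbo.' + QUOTENAME(@TableName) + N' (' + @ColumnList + N')';
IF @ExecuteCreate = 1
    EXEC sp_executesql @QueryString; -- execute the CREATE statement
-- otherwise @QueryString is simply returned to the caller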

Insight.s_ObjectsFromList_Create
Description
This procedure creates a given list of tables or views.

Steps
Loops through the tables, gets a list of columns from Insight.Attributes and calls Insight.s_Object_Create to create each table.

Inputs
 ListofTables – table valued parameter of the list of tables to be created.

 CreateVarcharMaxTables – creates untyped placeholder tables for initial Bulk Insert so that
data profiling can be done.

 ObjectType – type of object to be created. Acceptable values are 1 = Table or 2 = View.


 CreateTmpTable – defines if a temporary table is created to carry out the task

 BatchNum – as described previously

Insight.s_ImportTable_Update
Description
This procedure Bulk inserts CSV files into their respective tables.

Steps
 For each table, look at Hash_Totals and loop through and import all the CSVs (see the illustrative statement below).

Inputs
 PathName – as described previously.
 TableName – defines the table for which a column list must be created.
 BatchNum – as described previously
 TotalThreads – allows to manually assign the number of threads used to execute the stored
procedure (accepts NULLs)
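
The underlying operation for each file is a plain BULK INSERT; an illustrative statement (the file name and options here are assumptions, the real values are driven by Hash_Totals) might be:

BULK INSERT dbo.ACCOUNT
FROM 'E:\DW.Export\BNK.ACCOUNT.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);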

Insight.s_DataProfile_Create
Description
This procedure creates a data profile of all the tables to be imported. It determines what the correct data type of a particular column should be, and is used to update what the source system says the data type should be.

Data profiling is relatively slow, so in order to speed things up the records in a table are ranked in order of length and only the set of records which contains at least the maximum length for each column is returned. To make things more efficient, this is done on the top 50 percent of records.

Because only the top 50 percent of records is profiled, there is a risk that the profile will not reflect reality; if the eventual data insert fails due to this, the data profile is automatically run again with all records. However, if the insert does not fail, there is a risk that the data types could be sub-optimal. The only way around this is to data profile all records, which can be done with a very simple variable assignment in the code of the stored procedure.
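
As a rough illustration of the kind of probe this produces, the profile of a single column might be computed as follows (a sketch with hypothetical table and column names, not the generated code):

SELECT
    MAX(LEN(CATEGORY)) AS ColumnLength,
    MIN(CATEGORY) AS ColumnMinValue,
    MAX(CATEGORY) AS ColumnMaxValue,
    MIN(CASE WHEN TRY_CONVERT(decimal(38, 8), CATEGORY) IS NULL THEN 0 ELSE 1 END) AS ColumnIsNumeric
FROM dbo.ACCOUNT;
-- ColumnIsNumeric stays 1 only if every value converts to a number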

Steps
 Dynamically creates a table with the following fields for each Imported table.
TableName, ColumnName, ColumnLength, ColumnMaxValue, ColumnMinValue, ColumnMaxLenValue,
ColumnIsNumeric
 Based on the above table another table is created with the following fields, these steps are
separated for performance reasons.
TableName, ColumnName, ColumnMinValue, ColumnMaxValue,
ColumnMaxLenValue, ColumnLength, ColumnIsNum, ColumnIsBigInt,
ColumnIsDecimal, ColumnIsDate, ColumnIsNull, Added
 Insert the results into the Insight.DataProfiles table.

Inputs
 TableName – the name of the table to be profiled. E.g. AA_ARR_Account.
 NumberOfSampleRowsInside – number of sample rows to be profiled
 BatchNum – as described previously

 TotalThreads – as described previously

Insight.s_Attributes_Update
Description
This procedure updates the InsightImport data dictionary with data types and column information from various sources: the list of columns to be created is derived from the header row of the source CSV files from Core Banking, data dictionary information is taken from Standard Selection, and the Analytics User Data Types are retrieved and then used to override the existing data types (the actual data types are calculated during Data Profiling by s_DataProfile_Create).
Steps
 Merge Data from InsightImport.Insight.SourceColumns
 Merge Data from T24StandardSelection
 Merge Data from SourceSchema, Primary Index Columns and Rebuild from selection

 Merge Data from Insight Column Override.


 Merge Data from Data Profiling results

Inputs
 TableName – the name of the table to be profiled. E.g. AA_ARR_Account.

 EntityId – the Id of the table in Insight.Entities

 BatchNum – as described previously

Insight.s_T24AllMultiVvalue_add
Description
This procedure parses all types of multi-value tables: Local Ref’s, Local Ref Sub-Values, Multi-values and
Multi-value Sub-Values.
Steps
 Create temporary table based on the Entities and Attributes tables.

 Call one of two CLRs to parse the multi-values: s_ParseStringToColumns for Local Refs, and s_ParseMultiStringToColumns for all other sub-tables.
 Load from Temp table into Final Table.

 The above is called from Insight.s_Import_Control so the first load is into the final table with
nvarchar(Max) columns and subsequent loads are into the properly typed tables.

Inputs
 TableName – defines the name of the table to be profiled. E.g. AA_ARR_Account_Localref.
 IsGenericTyping – this parameter can be set to 1 for the Data Profiling, so NVarChar(MAX) is
used for each column rather than the actual data types
 BatchNum – as described previously

Insight.s_Import_Control
Description
This procedure is internally invoked by s_ImportBaseTables and s_ImportSubTables to manage the process
to Bulk Insert from CSV files to SQL.

Steps
For each table being run:
 Execute Insight.s_ImportSystem_Update
 Truncate or drop tables depending on whether @RecreateTables = 1 is specified.
 Create a list of tables to be imported and/or created using the InsightTables table created
previously.
 Execute Insight.s_SourceColumns_Update
 Execute Insight.s_Attributes_Update
 Set @Changes flag based on contents in ##MetaDataChanges, populated in s_Attributes_Update.
 Set the @CSV exists flag (whether a CSV exists for the table).
 Set @DataProfileDone Flag.
 If (@RecreateTables = 1 Or @Changes = 1 Or @ErrorFlag = 1)
 Create a new table with Nvarchar(Max) data types.
 Create a view on the above table; the Bulk Insert is done into the view because the table has more columns than exist in the CSV file. The view has the same column list and order as the CSV file.
 Execute Insight.s_ImportTable_Update
 Update the Insight.SourceColumns table with column IDs from sys.columns.
 If a data profile has not been done, create a data profile for the table.
 Execute Insight.s_Attributes_Update
 Once the data profile is done, typed tables can be created.
 Execute Insight.s_ObjectsFromList_Create to create typed tables based on the updated Insight.Attributes.
 Execute Insight.s_ImportTable_Update to insert the CSV into the properly typed table.
 If there is an error doing the above, go back and recreate the Nvarchar(Max) tables and start the process again.
 If nothing has changed from a previous day's load, Insight.s_ImportTable_Update will be run after it is determined there are no changes.
Inputs
 PathName – as described previously.
 TableName – name of the table to be processed. Acceptable entries are either the exact name of
the table to be processed, e.g. ACCOUNT, or ‘All’
 ReCreateTables – defines if data tables should be re/created. Acceptable entries are: 1 recreates
tables, 0 loads into existing table if data profile has been done
 TableType – defines the type of tables to be processed. Acceptable entries are: Regular, Local
ref, LRSV (local ref sub value), MV (multi-value), MVSV (multi-value sub-value). It should be noted
that, within Analytics ETL, s_Import_Control will be re-executed five times so that each available
Table Type is processed.

 SystemTablesExists – defines if system tables should be re/created which involves a new data
profiling process. Acceptable entries are: 1 Create System tables, 0 do not create system tables, 2
create system tables when sub-tables are being created.
 BatchNum – as described previously
 TotalThreads – as described previously

Import and Data Profiling


As discussed in the Inputs section above, the SystemTablesExists parameter controls whether data profiling should be forced or not (please note that, in any case, data profiling will be triggered if any inconsistencies are found between imported data and metadata). Below are two examples of exec statements for the two different scenarios:

 Only runs the Import:

Exec insightimport.Insight.s_Import_Control 'E:\InsightImport\11302012', 'all', 0, 'Regular', 1

 Forces tables to be regenerated and re-data profiled:

Exec insightimport.Insight.s_Import_Control 'E:\InsightImport\11302012', 'all', 1, 'Regular', 0

Insight.s_CreateImportViewForCsv
Description
This procedure creates a view of the base table for an individual CSV, with the columns aligned for a straight Bulk-Insert without a format file.
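
As a sketch of the idea, such a view restricts and reorders the base table's columns to match the CSV exactly, so a plain BULK INSERT into the view needs no format file (the object and column names below are hypothetical):

CREATE VIEW Insight.v_Import_ACCOUNT_BNK AS
SELECT MIS_DATE, LEAD_CO_MNE, BRANCH_CO_MNE, [@ID], CATEGORY
FROM dbo.ACCOUNT;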
Inputs
 CsvPath – File system path where the DW.Export CSVs are located. This should be all CSVs in
the same folder even if it is multi-company with different CSVs from different companies as they
should have different prefixes (example: BNK, CO1, CO2).

Insight.s_CreateEntities
Description
This procedure creates a list of tables to be created as well as the first CSV file to be called. It replaces s_Entities_Create (abbreviated as SEC) when importing CSVs with Data Quality, as it displays better performance. However, SEC has not been deprecated, since it is still in use for dynamically importing CSVs with Data Profiler (without Data Quality).

To better understand the relationship between Data Quality Import (performed by s_CreateEntities) and
Data Profiler Import (performed by SEC), we should note that the three configuration tables involved in
these processes are Insight.Entities, Insight.Attributes and Insight.ImportSchema. Data Quality Import
(DQI) does not use Attributes (but does use Entities), since DQI assumes all table structures and attributes are fixed as defined in Insight.ImportSchema. For non-system tables, any incoming data that cannot be converted to the predefined data type is subject to Data Quality Correction rules and no runtime error will be thrown.

Data Profiler Import (DPI) uses both Entities and Attributes (but not Insight.ImportSchema) to perform
dynamic import. Related procedures are: s_Entities_Create and s_Attributes_Update. If any corrupted or
incompatible source CSV data is encountered, bulk-copying to the target table will result in a terminating runtime error.

Under DQI, when a new table (not listed in ImportSchema) is brought in, the process reaches out to DPI procedures to perform a dynamic import. The structure of the newly imported table, determined by the Data Profiler, is then collected into ImportSchema. Unless it goes through another manual review and revision process, the content of ImportSchema basically becomes the blueprint for import tables.

Steps
 Updates Insight.Entities with system tables if necessary

 Updates Insight.Entities with base tables if necessary


 Updates the affected EntityIds in Insight.Attributes for backward compatibility
 Updates Insight.Entities with ‘sub’ tables if necessary and refreshes the
Insight.T24SubAttributeSourceData or Insight.Attributes for these sub tables
 Synchronize EntityIds with Insight.ImportSchema

 Update Logs

Inputs
 Operation – defines the type of tables to be processed and can have the following values:
o [0:All ] - All entities/tables including the following 1, 2 and 3;
o [1:System] – Temenos Core Banking configuration/ dictionary tables including:
dbo.STANDARD_SELECTION, dbo.HASH_TOTAL, dbo.LOCAL_TABLE and
dbo.LOCAL_REF_TABLE;
o [2:Base ] - InsightImport base tables excluding the four system tables indicated in option
1;
o [3:Sub ] - InsightImport ‘sub’ tables including: LocalRefs (LR), LocalRefSub-values
(LRSV), Multi-values (MV) and Multi-valueSub-values (MVSV)
 BatchNum – as described previously

Insight.s_Entities_Create
Description
This stored procedure creates a list of tables to be created as well as the first CSV file to be called. Also, it
logs the process times.

Steps
 Merge data from Hash_Totals
 Merge data from SourceSchema

 Merge Data from Standard Selection, Local_Table and Local_Ref_Table

Inputs
This stored procedure uses the InsightImport..Hash_Totals table to define the list of entities to be created
and the only input parameter is:

 BatchNum – as described previously

Insight.s_CreateImportTableStructure
Description
This stored procedure is used for DQI (Data Quality Import) to cast an empty table structure based on the
corresponding Insight.ImportSchema’s definition then replace the existing data table, if found in
InsightImport.

Inputs
 TableName – The target table that must be processed.
 BatchNum – Batch number used to execute the stored procedure.

Insight.s_CreateImportViews
Description
This stored procedure is used to create individual views for each and every CSV split. DQ Import can handle
both physical tables and views as the targets. During Analytics ETL and Process Data ExStore, the import
process normally uses views as DQI targets because of their simpler column mapping.

Inputs
 TableName – The name of table that must be processed.

 BatchNum – Batch number used to execute the stored procedure.

Insight.s_CreateSystemObjects
Description
This stored procedure updates the Hash_total table (the table that lists all the CSV files to be imported) and Standard Selection (the table that lists the source system metadata for each column). It takes a list of companies or entities as an input and loops through the companies, importing all the aforementioned files for each company. From R19, this stored procedure has been updated to materialize the v_T24MultiValues and v_T24MultiValueSubValues views, then merge their contents and insert them into the T24MultiValueSubValueTables table.

Inputs
 CsvDir – File system path where the DW.Export CSVs are located. This should be all CSVs in the
same folder even if it is multi-company with different CSVs from different companies as they
should have different prefixes (example: BNK, CO1, CO2). The syntax for the path is
‘DriveLetter:\Folder\Subfolder\’. Example ‘E:\DW.Export\’

 ListofCompanies - A table valued Parameter which contains the Companies to be imported,


company values are obtained from InsightImport..SourceSystemEntities.
 BatchNum – Batch number used to execute the stored procedure. This parameter introduces ETL
Batch control in the stored procedure and can be obtained from a parameter or indirectly from
dbo.Batch table through the function

Insight.s_DQ_ImportTable
Description
This stored procedure tries to bulk-load multiple CSV files in parallel into the target (base/regular) table. The target table's structure is cast from the template table (i.e., Insight.ImportSchema). In case of failure due to incompatible data (type), the affected areas are cancelled and the Data Quality CLR procedure takes over and handles them according to the pre-defined data quality rules. If no specific rule has been defined for the subject column, generic or default rules are applied. In the concurrent streaming process, the DQ procedure resolves the source-target column mapping, bulk-transfers and corrects data with the applicable rules on-the-fly. The resulting fields and revision details are output on the same row.

Inputs
 CsvDir – File system path where the DW.Export CSVs are located. This should be all CSVs in the
same folder even if it is multi-company with different CSVs from different companies as they
should have different prefixes (example: BNK, CO1, CO2). The syntax for the path is
‘DriveLetter:\Folder\Subfolder\’. Example ‘E:\DW.Export\’

 TableName – The table that must be processed.


 BatchNum – Batch number used to execute the stored procedure. This parameter introduces ETL
Batch control in the stored procedure and can be obtained from a parameter or indirectly from
dbo.Batch table through the function
 TotalThreads – allows to manually assign the number of threads used to execute the stored
procedure (accepts NULLs)

Insight.s_ReportFailingColumns
Description
This stored procedure takes a source and a destination table, assumes that all source columns will be inserted into similarly named destination columns, and returns the failing columns.

Inputs
 TableName – The target table that must be processed.

 TableNameSource – The source table that must be processed.

dbo.s_MergeKeyColumnDQRules
Description
This stored procedure is used, within the Data Quality Import (DQI) process, to merge and apply data
quality rules, especially for primary key columns. Null values in such columns need to be addressed before
converting them to primary keys. Typically, NVarChar-typed columns are revised as ‘{NULL}’, Int-typed
columns are revised as -1 and Date-typed columns are revised as ‘1900-01-01’.
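
The effect of these defaults can be pictured with statements like the following (hypothetical table and column names; the real procedure applies the rules generically):

UPDATE dbo.SOME_TABLE SET CUSTOMER_NO = N'{NULL}' WHERE CUSTOMER_NO IS NULL;  -- NVarChar key
UPDATE dbo.SOME_TABLE SET SEQUENCE_NO = -1 WHERE SEQUENCE_NO IS NULL;         -- Int key
UPDATE dbo.SOME_TABLE SET VALUE_DATE = '1900-01-01' WHERE VALUE_DATE IS NULL; -- Date key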

Inputs
 TableName – The target table that must be processed.
 TenantId – Id of the database tenant
 BatchNum – Batch number used to execute the stored procedure.

Insight.s_CollectSchema
Description
This stored procedure collects a table’s column schema information from Insight.ImportSchema. It should
only be used within the InsightImport database’s scope.

Inputs
 TableName – The target table that must be processed.

 SchemaName – name of the schema of the table for which the stored procedure will be executed

Insight.s_DQ_ImportTable
Description
This stored procedure handles the multithreaded DQI process. It operates the data quality-related SQL-CLR or T-SQL procedures and functions to import a specified base table from the CSV file splits that are placed under a directory.

Inputs
 CsvDir – File system path where the DW.Export CSVs are located. This should be all CSVs in the
same folder even if it is multi-company with different CSVs from different companies as they
should have different prefixes (example: BNK, CO1, CO2). The syntax for the path is
‘DriveLetter:\Folder\Subfolder\’. Example ‘E:\DW.Export\’

 TableName – The target table that must be processed.


 BatchNum – Batch number used to execute the stored procedure.
 TotalThreads – allows to manually assign the number of threads used to execute the stored
procedure (accepts NULLs)

Insight.s_TableHasDQIssue
Description
This stored procedure is called after the initial ETL import to find out whether there are any DQ issues in the selected table. It can return the following values:

 -1: Table with the name provided does not exist.
 0: No issue found.
 1: Has at least one DQ issue.
Inputs
 TableName – The target table that must be processed.
 SchemaName – name of the schema of the table for which the stored procedure will be executed
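
Assuming the status is surfaced as the procedure's return code, a caller could check a table like this (an illustrative sketch):

DECLARE @rc int;
EXEC @rc = InsightImport.Insight.s_TableHasDQIssue @TableName = N'ACCOUNT', @SchemaName = N'dbo';
IF @rc = 1 PRINT 'At least one DQ issue found';
ELSE IF @rc = 0 PRINT 'No issue found';
ELSE PRINT 'Table does not exist'; -- -1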

Online.s_ProcessImportOnline_Update
Description
This procedure is used to pull near real-time Online data into the InsightImport dbo schema to apply DQ, multi-value and sub-value parsing rules. Once data has been staged, it is loaded into the Online schema in InsightLanding.

Micro batches are run by calling this procedure at a set configurable interval. When all the DW Online intra-day data has been pushed through, the Online EOD process moves the data from the Landing Online schema into the Landing BS schema and a new Online processing day commences.

Steps
1. Start a new Online micro batch if there is no active one
2. Obtain list of tables to be processed based on new DW Online records
3. Process Online changes to T24/Temenos Core Banking metadata tables
4. Copy data from InsightImport Online Schema to InsightImport dbo schema
5. Do Data Quality checks as data is copied
6. Do Local Ref Parsing
7. Do Multi-value Parsing
8. Do Local Ref Sub-value Parsing
9. Do Multi-value Sub-value Parsing
10. Do Attribute Calculations-Import
11. Load data into InsightLanding Online Schema
12. Validate that all intra-day records for all companies have been processed by DW Online, then start the Online EOD process
13. Update DW Online Processing agent job freq_interval if changed in SystemParametersLanding
Inputs
 Source Name – Used to pass the name of the source system to be processed. In this case it is BS

 TotalThreads – allows to manually assign the number of threads used to execute the stored
procedure (accepts NULLs)
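
A micro-batch run scheduled by the agent job might then simply be (an illustrative call using the inputs above):

Exec InsightImport.Online.s_ProcessImportOnline_Update N'BS', NULL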

Online.s_InsightImport_Online_Tables_Create
Description
A system stored procedure that is used to create the InsightImport Online schema tables based on
STANDARD_SELECTION metadata

Inputs
 Table Name – Name of table to be created. Acceptable values are the table name or the keyword
‘ALL’
 RecreateTable – Flag used to force the dropping and recreation of a table. Acceptable values are ‘1’, ‘0’ or NULL

 BatchNum – Batch number used to execute the stored procedure. This parameter introduces ETL
Batch control in the stored procedure and can be obtained from a parameter or indirectly from
dbo.Batch table through the function

Online.s_ImportSystemTables_Update
Description
Online changes to T24/Temenos Core Banking metadata tables ('LOCAL_REF_TABLE', 'LOCAL_TABLE', 'STANDARD_SELECTION') are handled by this stored procedure.

s_InsightImportSystemTables_Update (Deprecated)
Description
This procedure handles CSV files from multiple companies exported separately with different hash total
files. All Hash total files from the separate companies will be loaded into InsightImport at the end of this
procedure and they will be combined into the same Hash Total table. This table will be used to drive
s_InsightImport_Update.

Steps
1. Drops all tables in database for first company import when Drop Tables parameter set to 1
2. Bulk Insert into Hash_Total table from company specific Hash_Total CSV file
Inputs
 Pathname – File system path where the DW.Export CSVs are located. This should be all CSVs in
the same folder even if it is multi-company with different CSVs from different companies as they
should have different prefixes (example: BNK, CO1, CO2).
The syntax for the path is ‘DriveLetter:\Folder\Subfolder\’. Example ‘E:\DW.Export\’
 Mnemonic – This is the company mnemonic prefix of the file. As there should only be one set of CSV files per company in the path, and therefore only one hash total file, this indicates which company is being imported. The syntax for this Mnemonic should match the Core banking Company Mnemonics exactly (e.g. BNK or GB1).
 DropTables – If Drop Tables is set to yes, then all tables will be dropped/truncated as
necessary. Only the first company in the set (BNK) should have this set to 1. This will allow one
or more companies to be imported after the first company without the existing records in the
Hash Total table being dropped. When the first company is added for a new import all records
from the Hash Total table should be dropped.

s_InsightImport_Update (Deprecated)
Description
Once all hash total files have been loaded by s_InsightImportSystemTables_Update,
s_InsightImport_Update can be invoked with the same Pathname used for
s_InsightImportSystemTables_Update. This will run through each file listed in the Hash Total table and
import it into the InsightImport database, all as generic nvarchar(max) datatypes.

Steps
1. Check if the last file has been copied to file path


2. Create tables in import based on CSV file extracts. Column names are based on the TXT files and the data comes from the CSV files.
Inputs
 Pathname – File system path where the DW.Export CSVs are located. This should be all CSVs
in the same folder even if it is multi-company with different CSVs from different companies as
they should have different prefixes (example: BNK, CO1, CO2).
The syntax for the path is ‘DriveLetter:\Folder\Subfolder\’. Example ‘E:\DW.Export\’

s_T24TableFromSelection_Create (Deprecated)
Description
Sometimes certain tables are blank in Core banking on a particular date. For example, a transactions table with a low volume of transactions, if exported per day, may not have records on a particular date. When DW.Export runs its extract, it does not create a file for these tables. This causes a problem downstream in the ETL process: if a mapping view is dependent on that table, for example, it would fail if the table is missing. So that ETL can process without failure regardless of the availability of these tables, a blank table needs to be created.

This procedure creates blank tables based on T24 Standard Selection metadata describing the table
columns and data types, as well as your Source Schema configuration. Only tables with T24RebuildFromSelectionIfMissing set to ‘Yes’ and for which no file was exported from DW.Export will be created.

This procedure also properly data types each column in each table by creating a table based on its Standard
Selection metadata and copying data into it from the raw Core Banking-imported table from
s_InsightImport_Update.

Steps
1. Get the list of tables that are missing in InsightImport and for which T24 Rebuild From Standard Selection is set to yes in the Source Schema table, and create them
2. Determine the proper data types for all columns and all tables in Import based on Standard Selection
3. Check whether any data types defined in Standard Selection do not match the data; otherwise alter the tables to have the proper data types
4. Create primary indexes on the tables
Inputs
No Parameters are required for this procedure.

s_T24SourceSchema_Update (Deprecated)
Replaced by s_Import_Control which calls s_T24AllMultiVvalue_add.

Description

Tables coming from Core Banking can have Local Reference, multi-value and sub-value fields on each
parent table. These are stored in columns and need to be parsed out into referential tables with a linking
unique key.

This procedure calls a custom CLR function to parse these records into their own tables and creates the
linking based on the PrimaryIndexColumn specification from the SourceSchema table.

Only tables with T24ExtractLocalRef, T24Multi-valueAssociation, T24ExtractSub-valueFromLocalRef, or
T24Sub-valueAssociation values configured in SourceSchema will be parsed.
T24SkipColumnsInExtractLocalRef allows you to skip parsing certain fields of the Local Ref.

Steps
Call a number of stored procedures:
1. s_T24Localref_add – create local ref tables based on the Standard Selection definition
2. s_T24Localrefsub-values_add – parse multiple sub-values for local ref fields
3. s_T24multi-valueSub-value_add – parse multi-value fields based on the source schema definition and parse
sub-values for multi-values
4. s_T24Consolkeys_add – parse consol keys
Inputs
No Parameters are required for this procedure.

Additional Information
Multi-valued Files
Core banking uses a three-dimensional file structure called a “non-first normal form” data model to store
multiple values for a field in a single record known as multi-valued fields. A multi-valued field holds data
that would otherwise be scattered among several interrelated files.

Two or more multi-valued fields can be associated with each other when defined in the file dictionary. Such
associations are useful in situations where a group of multi-valued fields forms an array or a nested
table within a file. You can define multi-valued fields as belonging to associations in which the first value
in one multi-valued field relates to the first value in each of the other multi-valued fields in the association,
the second value relates to all the other second values, and so on. Each multi-value field can be further divided
into sub-values, again obeying any relationships between fields.

Standard_Selection Table
STANDARD.SELECTION is the dictionary application for Temenos Core banking, and each application must
have an entry in STANDARD.SELECTION. It holds information about every field available in each of the
Core banking files. This table is extracted from Core banking and a copy resides in the InsightSource
database. Below is a classification of the types of columns in STANDARD.SELECTION.

Standard Selection Column Types
System
 Model bank columns with no multi-value or sub-values.
 Data type conversion is done based on standard selection.

System Multi-Values
 Model bank columns that are essentially multiple columns in one column.
 The columns are parsed into a separate table with multiple columns based on the definition in
SourceSchema.
System Sub-Values
 Model bank columns that are essentially multiple rows in one column
 Columns are parsed into a separate table with multiple columns based on the definition stored in
SourceSchema.
User Defined (Local Ref)
 Local development columns that are contained in Local.Ref column
 Data type conversion is done based on standard selection
 The columns are parsed into a separate table with multiple columns based on standard
selection.
 The columns to exclude from parsing are defined in SourceSchema.
User Defined (Local Ref) Multi-Values
 Local development columns that are essentially multiple rows in one column
 Data type conversion is done based on standard selection
 The columns are parsed into a separate table with multiple rows based on standard selection,
local_table and local_ref_table.
 The columns to exclude from parsing are defined in SourceSchema.

Examples of Multi-Value and Sub-Value Parsing


We can see below an example of an unparsed multi-value field as it has just been imported from a CSV
file (in this case, the field consists of local references). The BNK_ACCOUNT table has a multi-value field
called LOCAL_REF. In any Core banking table, LOCAL_REF is a special type of multi-value field used to
store locally developed columns. To differentiate between standard multi-value fields and local reference
multi-value fields, we call the latter simply local references or local refs.
Figure 9 - Example of unparsed multi-value Local reference field

If we parse this local reference, a new table will be created called BNK_ACCOUNT_LocalRef in which each
value of the local reference will be allocated an individual column.

Below, we can see what happens when the Local ref has been parsed correctly by Analytics ETL. All columns
have proper core banking names and SQL server data types, based on metadata.

Figure 10 - Example of parsed multi-value Local reference field

Within the same example of parsed multi-value field, we can see that some of the new columns resulting
from the multi-value field in Core banking have, in turn, sub-values. In the example below, the
LENDING_OFFICER and LENDING_ROLE columns (resulting from the parsing of the LocalRef field in the
new ACCOUNT_LocalRef table) contain three sub-values each and are associated with one another.

Figure 11 - Example of parsed multi-value Local reference field where sub-values are unparsed

If we perform sub-value parsing on ACCOUNT_LocalRef, the result of sub-value parsing will be stored
in a new table called, in this case, BNK_ACCOUNT_LocalRef_LENDING_OFFICER, as shown in the next
figure. LENDING_OFFICER is the name of the first of a series of associated multi-value fields, each of which
can contain multiple sub-values.

Figure 12 - Example of parsed sub-values

Parsing of Multi-Valued Columns


Not all columns in Temenos Core banking are multi-valued; therefore, before trying to parse a column, it is
advisable to find out whether it is single-value or multi-valued. Check in the Core banking application
front-end and look for multi-valued columns that are associated, and group them accordingly. Insight also
provides a view, v_T24StandardSelection, that can be used for this purpose. This view is available in the
InsightImport database.
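As a sketch, the multi-valued fields of a given file can be listed with a query along the following lines. The exact column names of v_T24StandardSelection are assumptions based on the figures in this section and should be verified locally:

SELECT *
FROM InsightImport.dbo.v_T24StandardSelection
WHERE SelectionName = 'CUSTOMER'        -- the Core banking file of interest (column name assumed)
  AND [SingleOrMulti-value] = 'M';      -- 'M' marks multi-valued fields (column name assumed)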


System Multi-Values
In addition to the multi-values in Local reference fields, of which we have seen some examples above, Core
banking also has system multi-values, i.e. standard fields of standard Modelbank application which contain
multiple values.

The Standard Selection table from Temenos Core banking defines whether a specific field on a table is
a system multi-value or a single-value field. If we run a query on the v_T24StandardSelection view and
examine the output, we can see some examples of system multi-values: looking at the
CONTACT.CLIENT, CONTACT.TYPE and CONTACT.STATUS field definitions below, the content type for all
these columns is set to System, while the SingleOrMulti-value column is set to ‘M’ for multi-value.

Figure 13 - System multi-value on the v_T24StandardSelection view

In order to parse the CONTACT.CLIENT, CONTACT.TYPE and CONTACT.STATUS multi-valued fields to a
new table, the T24Multi-valueAssociation column in the SourceSchema table for the
BNK_CCS_CR_CONTACT_LOG record has to be set to
‘CONTACT_CLIENT:CONTACT.TYPE:CONTACT.STATUS’.

Once the Insight.s_Import_Control stored procedure is run with TableType set to MV within the Analytics
ETL process, a new table called BNK_CCS_CR_CONTACT_LOG_Contact_Client will be created. This table
will contain multi-value parsing for CONTACT.CLIENT and its associated CONTACT.TYPE and
CONTACT.STATUS multi-value fields.
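A minimal configuration sketch for this example follows. The SourceSchema key column and the s_Import_Control parameter names are assumptions and should be checked against the deployed objects:

UPDATE dbo.SourceSchema
SET [T24Multi-valueAssociation] = 'CONTACT_CLIENT:CONTACT.TYPE:CONTACT.STATUS'
WHERE TableName = 'BNK_CCS_CR_CONTACT_LOG';   -- key column name assumed

EXEC Insight.s_Import_Control @TableType = 'MV';  -- parses the configured multi-values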

System Sub-Values
Once BNK_CCS_CR_CONTACT_LOG_CONTACT_CLIENT has been created, there may be sub-values
separated by a special character in the CONTACT_CLIENT column. In order to parse these sub-valued
fields to a new table, the T24Sub-valueAssociation column in the SourceSchema table for the
BNK_CCS_CR_CONTACT_LOG record has to be set to
‘BNK_CCS_CR_CONTACT_LOG_CONTACT_CLIENT|CONTACT_CLIENT’. Once the
Insight.s_Import_Control stored procedure is run with TableType set to MVSV within the Analytics ETL
process, a new table will be created as BNK_CCS_CR_CONTACT_LOG_Contact_Client_Contact_Client.
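Continuing the same example, a hedged sketch of the sub-value configuration and run (key column and parameter names assumed, as above):

UPDATE dbo.SourceSchema
SET [T24Sub-valueAssociation] = 'BNK_CCS_CR_CONTACT_LOG_CONTACT_CLIENT|CONTACT_CLIENT'
WHERE TableName = 'BNK_CCS_CR_CONTACT_LOG';   -- key column name assumed

EXEC Insight.s_Import_Control @TableType = 'MVSV';  -- parses the configured sub-values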

User Defined (Local Ref)


In addition to system multi-values, Standard Selection also stores the description of user-developed multi-
value fields. We can see some examples in the figure below, where a query on the v_T24StandardSelection
view filters entries with ColumnType set to User and with the Selection Name column set to CUSTOMER. Again,
we can identify multi-value fields by looking at the SingleOrMulti-value column, which is set to ‘M’.


Figure 14 - User multi value (local references) on the v_T24StandardSelection view

To be able to parse any of the LocalRef fields shown above to a new table, the T24ExtractLocalRef column
for the BNK_CUSTOMER record in SourceSchema has to be set to ‘Yes’. Once the Insight.s_Import_Control
stored procedure is run with TableType set to LocalRef within the Analytics ETL process, a new table will
be created as BNK_CUSTOMER_LocalRef.

User Defined (Local Ref) Multi-Values


Parsing the sub-valued LocalRef columns for the BNK_CUSTOMER table requires one additional step. The
T24ExtractSub-valueFromLocalRef column in the SourceSchema table for the BNK_CUSTOMER record has
to be set to ‘Yes’. Once the Insight.s_Import_Control stored procedure is run with TableType set to LRSV
within the Analytics ETL process, multiple tables will be created as per the Standard Selection definition of
multi-valued columns. For example, the new table used to parse sub-values for the CONTACT.TYPE column
will be named BNK_CUSTOMER_LocalRef_Contact_Type.
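Both local reference steps for BNK_CUSTOMER can be sketched as follows (again, the key column and parameter names are assumptions):

UPDATE dbo.SourceSchema
SET T24ExtractLocalRef = 'Yes',
    [T24ExtractSub-valueFromLocalRef] = 'Yes'
WHERE TableName = 'BNK_CUSTOMER';             -- key column name assumed

EXEC Insight.s_Import_Control @TableType = 'LocalRef';  -- creates BNK_CUSTOMER_LocalRef
EXEC Insight.s_Import_Control @TableType = 'LRSV';      -- creates e.g. BNK_CUSTOMER_LocalRef_Contact_Type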

R19 Multi Value and Sub Value Parsing Enhancements


From R19, the multi-value (MV) and multi-value sub-value (MVSV) parsing feature has been enhanced as
follows:

 The MV/MVSV split has a new parameter table, T24Multi-valueSub-valueTables, with a pre-populated list
of all MV/MVSV-associated columns for all the tables defined in SourceSchema with
T24Multi-valueAssociation and T24Sub-valueAssociation set to ‘Yes’.
 A parameter field has been added to SourceSchema to enable or disable the MV/MVSV parsing, and
ETL splits those columns based on the new parameter table.

The new functionality was introduced to eliminate user errors that occurred when an MV/MVSV-associated
column configured for parsing was missing. The analytics system lists all the MV/MVSV-associated
columns in Source Schema, and the user has to manually enable column parsing by setting
the related flag in this configuration table. Based on the content of the latter, the stored procedure
s_CreateSystemObjects will populate the T24Multi-valueSub-valueTables table.

New functionality
Two new views and one new table will be created when Analytics ETL runs for the first time.

1. v_T24Multivalues
2. v_T24MultivalueSubvalues
3. T24MultivalueSubvalueTables


The value of the FIELD_NAME_TEMPLATE field, from the Temenos Core Banking STANDARD_SELECTION
table, is used to get the value marker of each field and to determine how it should be parsed. The meaning of
each value marker is listed below.

 XX< Start of associated multi-value
 XX- Items in an associated multi-value
 XX> End of associated multi-value
 XX<XX< Start of associated multi-value and sub-value
 XX-XX< Items in an associated multi-value with the start of an associated sub-value
 XX-XX> Items in an associated multi-value with the end of an associated sub-value
 XX>XX> End of both associated multi-value and sub-value
 XX. Individual multi-value not surrounded by an associated multi-value (the previous and following fields
do not have XX< or XX>)
 XX.XX. Individual sub-value not surrounded by an associated sub-value

Note: language-specific fields have an additional LL_ prefix that needs to be replaced to match the actual
Standard Selection fields.

Old process
Until R18, the database administrator had to manually configure the MV/MVSV columns to be parsed in Source
Schema, and Analytics ETL split only those columns. If a specific MV/MVSV parsing was defined in
SourceSchema and a corresponding CSV file did not exist, ETL would create the table and data-type it based
on the STANDARD.SELECTION table.

On a subsequent run, if the CSV file existed for the MV/MVSV, the ETL process would try to insert the
MV/MVSV values into the previously data-typed fields and would fail if the data typing/profiling had not been
accurate. That would trigger the Data Quality (DQ) process to start and apply the required replacements.

If a table had been configured with T24MultiValueAssociation but not T24SubValueAssociation, yet
contained MVSV columns in T24MultiValueAssociation, the data typing/profiling would be done based on the
T24SubValueAssociation field. This would fail and trigger DQ for replacements.

Configuration
Configuring Source Schema (Add/Remove/Configure Tables)
Using the Source Schema definition in the dedicated section, which describes the different columns and the
available syntax, each table to be imported must be listed and configured accordingly.

By default, Analytics provides a pre-configured version of Source Schema that works with an unmodified
Model Bank configuration of Temenos Core banking. At this point, you will need to configure
SourceSchema to reflect how your bank’s implementation of Core banking differs (at a table
level) from what Model Bank provides. Typically this means turning tables off by removing rows entirely, or
adding new records for additional tables to be brought in.

Multi-Value, Sub-Value, and Local Reference tables should be configured within the parent table record in
Source Schema. They do not need separate rows because there are configuration columns for this on
the parent table row.

In general, for each table the following steps need to be taken:


 Ensure the desired table is being extracted from DW.Export (see DW.Export Configuration guide
for assistance)
 Ensure the desired CSV file is in the directory from which ETL has been configured to read. There
may be one or more CSV files for each file being extracted from Temenos Core banking.
InsightImport will handle this, and no configuration changes are needed if multiple CSV files
are present for a particular core banking file.
 Add a record (if one does not already exist) in the Source Schema table, configuring
SourceSchema as per this table’s definition and syntax in the SourceSchema section.
 Run the InsightImport procedures and verify that you have configured everything properly by reviewing
the output. Ensure the appropriate local ref and multi-value tables have been created and have been
data-typed accordingly.

Running the procedures / Configuring the Analytics ETL Job


The Analytics ETL Job is a SQL Agent job configurable per client to execute that client’s specific workflow.
It will mostly be configured to accept different procedure parameters but may have
additional workflow built in.

InsightImport can be configured to run as part of the Insight SQL Agent job or run manually via the stored
procedures. Most of the Import procedures are orchestrated by Insight.s_ImportBaseTables, which loads the
content of the CSV files into Import tables, and Insight.s_ImportSubTables. The latter must be executed
several times with different input parameters: first to parse local reference fields, then multi-values,
multi-value sub-values and local reference sub-values.

Stored procedure execution should comply with the following order and input parameters, e.g.:

USE InsightImport

Exec [Insight].[s_ImportBaseTables] @CsvDir = '<CSV files path>', @TableName = 'All', @SystemTablesExist = 0, @BatchNum = null, @TotalThreads = null;

Exec [InsightImport].[Insight].[s_ImportDataReportErrors] @TableName = 'All', @BatchNum = null;

Exec [Insight].[s_ImportSubTables] @TableName = 'All', @TableType = 'LocalRef', @BatchNum = null, @TotalThreads = null;

Exec [Insight].[s_ImportSubTables] @TableName = 'All', @TableType = 'MV', @BatchNum = null, @TotalThreads = null;

Exec [Insight].[s_ImportSubTables] @TableName = 'All', @TableType = 'MVSV', @BatchNum = null, @TotalThreads = null;

Exec [Insight].[s_T24ConsolKeys_Add] @TableName = 'All', @BatchNum = null;

Exec [Insight].[s_BuildCOA]

Logging
Logging for all InsightImport activities is recorded in dedicated tables within the InsightETL
database.


InsightLanding
Overview
InsightLanding is a multi-source archive used to land all source data together. It holds multiple days of
source data and is the only relational database other than InsightWarehouse in the Analytics solution
that does so. Raw source system data can be consumed at this point by reports, various other types of
Analytics web content or other systems.

Any data that may need to be reprocessed into InsightWarehouse has to be retained in InsightLanding.
The primary purpose of Landing is to facilitate easy reprocessing of all historical data for Insight Warehouse
without the need to maintain or re-import separate copies of source system data (e.g. CSV files extracted
from core banking).

Online vs Batch processing


As previously explained, from R19, InsightLanding can also include a set of online tables that will be updated
with intra-day data after a certain time interval. The database administrator can define which tables will be
updated in a near real-time manner and which tables will be subject to batch updates through the
ExtractList configuration table.

If batch processing is selected for a table, the table will use the <source-system>.<table-name> schema,
e.g. BS.CUSTOMER or Budget.GLBudget, and it will be updated when the daily ETL runs with the latest
business data from the relevant source system. For example, BS.CUSTOMER will load a copy of the latest
content of the corresponding CUSTOMER table in InsightImport every day (which in turn is populated with the
latest CUSTOMER data from Temenos Core Banking). BS.CUSTOMER, like any other InsightLanding table,
represents an archive of all the historical data that has been loaded to Analytics via ETL. Users are able to
query data for the appropriate business date from the BS.CUSTOMER table in InsightLanding by filtering on
the MIS_DATE column.
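For example (a sketch; the column names are illustrative and the abstraction view is assumed to exist for the table):

SELECT CustomerId, ShortName          -- list only the columns actually needed
FROM InsightLanding.BS.v_Customer
WHERE MIS_DATE = '2018-04-16';        -- always filter on the business date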

If an InsightLanding table is flagged for online processing, the first load will also populate the standard
<source-system>.<table-name> schema table e.g. BS.CUSTOMER. However, after this first load, any intra-
day updates will be loaded into “Online” schema tables e.g. Online.CUSTOMER. These online tables will be
updated with any new records coming from near real-time tables in InsightImport after a specified time
interval. Furthermore, during the Online End of day (EOD) process the content of the online tables will be
moved to the associated <source-system>.<table-name> schema table. E.g. ETL will copy the intra-day
records from Online.CUSTOMER to BS.CUSTOMER, then erase the content of the former. When the next
batch Analytics ETL or Process Data ExStore runs intra-day, data will already be loaded.

InsightLanding includes a number of abstraction views for both batch and online tables (e.g. BS.v_Customer
and Online.v_Customer, respectively). While the former are based only on batch InsightLanding tables such
as BS.CUSTOMER, the latter query a full set of all Extract List tables (including batch-only ones) and are
a union of online deltas and the previous business date’s batch data. However, for STMT_ENTRY, CATEG_ENTRY
and RE_CONSOL_SPEC_ENTRY (account/GL transaction tables), only intra-day data will be shown in the Online
views.

If a traditional batch update is selected for a table, data is landed using columnstore storage format.
Tables storing intra-day updates, instead, make use of a row-based storage format.

Columnstore Index
Columnstore index is a Microsoft SQL Server technology for storing, retrieving and managing data by using
a columnar data format. This type of index stores data column-wise instead of row-wise.


From R18, Temenos Analytics has adopted and implemented columnstore index technology in
InsightLanding. The main benefits of using columnstore indexes in InsightLanding are high compression rates
and high query performance gains over traditional row-oriented storage. Disk space requirements have
been reduced by up to 90%.

All dated schemas have been changed to a single source-named schema, for example BS for Banking System.
For instance, a table from the core banking system would have a schema of BS.TableName, and all historical
data for extracted dates will be stored in this table. All source tables that are landed are converted to
clustered columnstore indexes.

All source data is stored raw, exactly as it was extracted and without any transformations, with the exception of
core banking data, which has additional tables created in InsightImport to deal with the multi-valued data
and added data types (see the Configuration section of the InsightImport chapter for details about this
process). Additional calculated columns on source data can be added to individual tables as per rules
defined in the AttributesCalculation table and in the Rules Engine-related tables of the InsightETL database.

Temenos Core Banking multi-value fields can be larger than the biggest supported string type
(NVARCHAR(4000)) and would need to be stored as a MAX data type, which cannot be kept in a columnstore.
When the data load process encounters column values greater than the largest supported type, a process is
triggered to move the column to row storage (the MaxString tables). Only unique column values are stored in
the rowstore lookup, with an associated hash value, and the original table column values are replaced by
the hash lookup value. The rowstore lookup table uses the most efficient storage possible by retaining only
the unique values for each column that surpasses the supported data types.

Reporting views join both the Columnstore index base tables and the lookup table making the split seamless
to the end user.
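Conceptually, a generated reporting view resolves the hash placeholders back to the raw long strings roughly as follows. This is illustrative only: the real views are generated, use a CLR function for the lookup, and the table and column names here are assumptions.

SELECT t.MIS_DATE,
       COALESCE(ms.MaxString, t.LOCAL_REF) AS LOCAL_REF   -- fall back to the stored value when no hash match exists
FROM BS.ACCOUNT AS t                                      -- columnstore base table (illustrative name)
LEFT JOIN dbo.MaxString AS ms
       ON ms.Sha1Hash = t.LOCAL_REF;                      -- hash placeholder written during load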

ETL batch control, the multithreading process and all logging have not changed in this release.

It is important to mention that processes that load data to InsightLanding using the dated schemas of
previous releases have not been sunsetted and are fully supported.

Specific Features / Functions


Source Data Archiving – All source data to be used in the Advanced Analytics Platform solution, or to be
stored historically and consumed by Analytics or other external applications, is archived in InsightLanding
in its raw source system format under a single schema per source system.

Renaming Source Tables – Used to rename source tables and store the table in Landing under the new name.
This is useful, for example, if a core banking system table has changed names but still has exactly the same
structure. You can use this facility to rename the table so the downstream mapping that takes place does
not need to be changed and can use the original table name.

Filtering Source Data – Any source table can be filtered to control the amount of data being stored in
Landing. This filtering can be configured to limit the number of rows or columns being imported for each
source table.

Data Compression – Source data is compressed using columnstore index technology, which provides high
compression rates over traditional row-oriented storage. Disk space requirements have been reduced by up
to 90%.

Online Reports Source – InsightLanding provides a number of new abstraction views under the Online schema
that can be used to feed reports with near real-time snapshots of a table’s content. In case only intra-day
data is needed in a report, the data should be retrieved from the Online tables directly instead.

Technical Details
Architecture
In the figure below, we can see how the Insight Landing database fits in the Advanced Analytics platform’s
architecture.

Figure 15 - InsightLanding in the Core Analytics ETL Flow


Technical Components
Tables
dbo.ExtractList
The ExtractList is a configuration table in InsightLanding that contains a list of all source tables to be
imported into Landing from all source systems. The table has one record per source system table and is
used to configure which tables are imported into Landing, whether or not to rename them, how (if at all) the
source table should be filtered, and how long to retain historical data for a particular table.

If Temenos Core banking is used as a source system, the content of the SourceSchema table in InsightImport
must be consistent with the content of ExtractList.

Column Name Description


ExtractListId Record Id (identity column). Populated automatically.
SourceName This is the source identifier that will be added to the schema
along with the date of each import into the landing database.
An example would be BS for Banking System.
SourceServer If the source data being imported is from a separate linked
server, you can reference the linked server name here. The
syntax is just the server name portion of a fully qualified
database reference, e.g.
ServerName.SchemaName.TableName
SourceDB Contains the database name for the database that contains the
source table.
SourceSchema Contains the schema name for the source table.
SourceTable Contains the source table name. E.g. InsightImport, Budget etc.
TargetTable This should be populated if the source table name needs to be
changed. The table will be stored in Landing with the name
entered here.
ImportFlag This flag controls whether or not a particular table will be
imported into Landing. You can use this to disable a particular
table import but still retain the record in the Extract List.
Acceptable values: 1 for yes and 0 for no.
ImportFields This column is used to control which columns are imported from
the source table. If all columns are to be imported then enter *
otherwise enter a comma-separated list of the desired columns.
Acceptable values: * or <ColumnName1>,<ColumnName2> …
<ColumnNameN>.
ImportOrder Used to control the order in which source tables are imported
per source system. Use an integer number to create an ordinal
list for each table in a source system if required. Otherwise set
all records for a source system to 1 to have them import at the
same time.
WhereClause If the number of rows needs to be reduced for a source table,
you can enter a WHERE clause statement here, which will be
applied to the import SELECT statement. Any SQL entered here
will be applied after the WHERE clause of the import SELECT
statement.
Syntax: TransactionDate = ‘2014-01-01’ and TranType = ‘External’
UserId The authorized user (can be user or schema) for the table.
Deprecated and no longer in use.
PurgeOlderThan Used to control the number of months retained in Landing for
each source system table; when the
s_InsightLanding_CSI_Purge stored procedure is executed,
any extracts older than this number of months are deleted. The
procedure has the option to exclude month-end dates.
Syntax: 90
For backwards compatibility when using dated schemas
CombinedTranDays Defines the number of days of transactions that should be
combined while copying the currently described entity from
Insight Landing to Insight Source
PrimaryIndexColumns Enter one or more columns to be used as the primary index for
the table. If you need a composite index then enter each
column separated by a comma.
Syntax: <ColumnName1>,<ColumnName2> …
<ColumnNameN>.
CreateCombinedViews A flag used to combine data from all dates available for a table
or just the current ETL date data
For backwards compatibility when using dated schemas
FlexStringLoad An internal system flag used to mark a table for Flex String Load
processing. It is set when at least one column in a table contains
a string value with more than 4000 characters
BSDateColumn Indicates the business date column name for source system.
This column will be used when deleting and reloading existing
data. In the case of Temenos Core Banking source data this
column is MIS_DATE
BSDateColumnAdd When a source table doesn’t contain a business date column,
this field has to be set to 1. This flag instructs the code to add
a column to the table, named as per the BSDateColumn definition,
to stamp the rows in the table with a business date
BSDateJoinClause Specifies the JOIN condition used to obtain the source
business date. Only needed when BSDateColumnAdd is set to
1
OnlineProcess This is a flag to set whether the table considered is populated
through the batch method or online. It was introduced due to
the necessity to separate entries in ExtractList for Online
process, as their configuration of the SourceDB, SourceSchema
and SourceTable columns will change. Acceptable values are 1
(which means Online), 0 (which means Batch) or NULL (also
means Batch).
Configuration Configuration defines the information source for the current
row – available values are:
- ModelBank: this entry has been added to satisfy
Temenos core banking mapping and/or business rules

- Local: the entry is used during the implementation to update or enhance Framework or ModelBank
functionality
- Framework: this entry has been added to the TFS
Framework solution and it is core banking agnostic
- PBModelBank: Used for Private Banking record
definitions
- CampaignAnalytics: Used for Campaign Analytics
solutions
- Predictive: Used for Predictive Analytics solution
when deployed
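As a sketch, a non-core source table could be registered along these lines. All values are illustrative and should be adapted to the actual source; the remaining ExtractList columns are assumed to be nullable or defaulted:

INSERT INTO dbo.ExtractList
    (SourceName, SourceDB, SourceSchema, SourceTable, ImportFlag,
     ImportFields, ImportOrder, PurgeOlderThan, BSDateColumn, OnlineProcess, Configuration)
VALUES
    ('Budget', 'InsightSource', 'dbo', 'GLBudget', 1,   -- land the Budget GLBudget table
     '*', 1, 36, 'MIS_DATE', 0, 'Local');               -- all columns, batch mode, keep 36 months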

dbo.ExtractSourceDate
The ExtractSourceDate table is used to store the queries that retrieve the current extract date
from the source system data. One record for each source system defined in the Extract List table is required.
The date returned from the query will be used to create the date portion of the schema used for each table
stored in InsightLanding. If a source system makes use of both batch and online extraction for its tables,
this source system will have two entries in ExtractSourceDate: one with the OnlineProcess flag set to 1, and
the other with this flag set to 0 or NULL. The business date for online extraction will be the business date
used for batch extraction plus one business day.

Please note that Dates tables are used to control ETL dates for each source system. These Dates tables
need to be configured for batch processing and they are built by APIs.

Column Name Description


SourceName Name of the source system that an extract date will be set for.
This should be identical to one of the source system names
defined in the Source Name column of the Extract List table.
Example: BS
BSDateSQL Contains the SQL Select statement that sets the variable
@bsdate used in the Landing update procedure to the current
extract date from the source system. Must return only one
date.
Syntax: select @bsdate = ExtractDate from
SourceDB.SourceSchema.SourceTable
OnlineProcess This is a flag to set whether the source system considered is
populated through the batch method or online. Acceptable
values are 1 (which means Online), 0 (which means Batch) or
NULL (also means Batch). The Banking System (BS) source will
have two entries in this table: one definition for BS Online and
the other for BS Batch processing.
Configuration Configuration defines the information source for the current
row – available values are:
- ModelBank: this entry has been added to satisfy
Temenos core banking mapping and/or business rules

- Local: the entry is used during the implementation to update or enhance Framework or ModelBank
functionality
- Framework: this entry has been added to the TFS
Framework solution and it is core banking agnostic
- PBModelBank: Used for Private Banking record
definitions
- CampaignAnalytics: Used for Campaign Analytics
solutions
- Predictive: Used for Predictive Analytics solution
when deployed

<Source Name>.SourceDate
The SourceDate table is used to store all the business dates for which data has been loaded into
InsightLanding for each source system used. One SourceDate table for each source system defined in the
Extract List table is required e.g. BS.SourceDate for Temenos Core Banking, Budget.SourceDate for Budget
etc. Each row in the table represents one individual business date within a specific source system from the
least to the most recent. The dates stored in this table will be referenced by the
s_InsightLanding_CSI_Purge stored procedure during the purging process.

Column Name Description


<Source Name>BusinessDate, e.g. BSBusinessDate – Business date for which data was processed into
InsightLanding within the source system considered. Dates are
stored in YYYY-MM-DD format, e.g. 2018-04-16
Purged This flag indicates whether a purging process has ever run for this
particular business date; however, it does not specify whether
purging affected all tables or just some of them. Acceptable
values are 1 (i.e. a purging process has previously taken place
for this business date) or 0 (i.e. no purging process has ever
taken place on any table for this business date).

dbo.MaxString
This table, introduced in R18, contains column values whose size exceeds 4000 characters. The
views use a CLR function to get the values from this table and populate the respective columns.

Column Name Description


MaxStringId Id of the row entry in table
Sha1Hash It is the hash value of the column that exceeded 4000
characters
MaxString It is the raw unchanged string that exceeded 4000 characters
OccurrenceCount Counter of the number of occurrences of the string


dbo.MaxStringCounter
This table, introduced in R18, records where column values exceeding 4000 characters occur, i.e. the
database, schema, table and column of each occurrence.

Column Name Description


MaxStringId Id of the row entry in table
Occurence Occurrence of the string that exceeded 4000 characters
DatabaseName Name of the database where the occurrence took place
SchemaName Name of the schema for the table where the occurrence took
place
TableName Name of the table where the occurrence took place
ColumnName Name of the column where the occurrence took place

dbo.T24Entities
The T24Entities and the T24Attributes tables have been introduced from Release 2017 to comply with the
regulation requirements defined by BCBS 239 (Basel Committee on Banking Supervision's regulation
number 239), with subject title “Principles for effective risk data aggregation and risk reporting”.

These two tables are used to store in InsightLanding, respectively, the description of each table and the
description of each column imported from Temenos Core banking. This information is extracted from
Temenos Core banking’s help text files and is, therefore, consistent across the two systems.

The structure of the T24Entities table is covered below, together with a description of all the columns in
this table.

Column Name Description


EntityId Record Id (identity column). Populated automatically.
T24HelpFile The absolute path to the location where help text files from
T24/Temenos Core banking are located e.g. C:\TFS
DataPath\MB\COMMON.FIELDS.xml
Product The product code for a specific application, as per what
assigned in Temenos Core banking. This is a two letter code
e.g. AA
Name Name of the table in Core banking e.g. AA.ARR.CHARGE
Name_Insight Name of the InsightImport table corresponding to the imported
Core banking table e.g. AA_ARR_CHARGE
Description Description of the table content, structure, and use.

dbo.T24Attributes
The T24Attributes table, as previously mentioned, stores the description of each column imported from
Temenos Core banking. This information is extracted from Temenos Core banking’s help text files.

The structure of the T24Attributes table is covered below, together with a description of all the columns in
this table.

Column Name Description


AttributeId Record Id (identity column). Populated automatically.

EntityId Foreign key which defines which table a specific column
belongs to, i.e. the record id in the T24Entities table
Name Name of the column in Core banking e.g. OVERDRAFT.STATUS
Name_Insight Name of the InsightImport column corresponding to the
imported Core banking column e.g. OVERDRAFT_STATUS
Description Description of the column syntax and use.

dbo.ViewMetadata
ViewMetadata is an Internal Table available for debugging data typing/profiling on columns.

Field Name Description


ItemTableName A particular instance of the table for which the view is
created.
TableName The table name. E.g. GLBudget, ACCOUNT etc.
ColumnNameAs The Name of the Column, e.g. BusinessDate.
LeftSchema The schema related to the ItemTableName in
InsightLanding e.g. 20170101BUDGET, 20170417BS etc.
SchemaName The schema related to the InsightLanding table where the
column with the largest ranking data type is stored e.g.
20170101BUDGET, 20170417BS etc.
ColumnName The column name of a particular instance of a table that will
be used as ColumnNameAs, e.g. BusinessDate
DataType The data type to cast a column to in the case when data
types across schemas are different. This is the maximum
sized data type of all the instances of the particular table
e.g. [date], [nvarchar](50) etc.
ItemDataType The datatype of a particular column e.g. date, nvarchar etc.
ItemFullDataType The full datatype of a particular column e.g. [date],
[nvarchar](50) etc.
ItemMaxLength The maximum length of a particular column e.g. 3, 50 etc.
ColumnOrder The column order within the table, e.g. 1, 2, 3 etc.
Max_lengthAll The max length of a column across all instances of the
column and for all data types e.g. 3, 50 etc.
MinDataType The smallest data type instance of a particular column E.g.
date, nvarchar, float etc.

dbo.DQSummaryLog
This table provides logging facilities for all InsightLanding tables that were subjected to Data Quality.

Field Name Description


MIS_DATE Business date in which DQ was processed
TableName The table name. E.g. GLBudget, ACCOUNT etc.
RevisionRowCount Count of the table’s rows that underwent DQ revision
RevisionColumnCount Count of the table’s columns that underwent DQ revision


dbo.SystemParametersLanding
This table is a Data Manager Custom Table rule where system parameters that are specific to the
InsightLanding database are defined. As discussed in the Rules Engine section, the types and values for a
given system parameter are mapped. When the rule creation steps are run by an agent job, the
s_CreateRuleGroup procedure creates this table; hence all columns in this table are populated
automatically.

Column Name Description


SystemParametersLandingId Record Id (identity column).
ForeignKey Id of the associated Custom Table rule.
Value Value of the parameter to be defined by the business
rule. E.g. C:\Program Files\Microsoft SQL
Server\MSSQL13.MSSQLSERVER\MSSQL\DATA
Name Name of the parameter to be defined by the business
rule. E.g. SQL Data File Path
Type Type of process considered. E.g. Partitioning.
Source Source of the parameter defined, if any.

Views
The InsightLanding database provides abstraction views to select data from all batch tables loaded in the
landing database; they use the same schema name as the columnstore index tables. A MIS_DATE filter for the
business date of the source data should always be used.

If a Flex String Load has been done on a table for the BS source system, the views will join the columnstore
index tables to the MaxString tables to present all fields, including the raw full strings that are longer
than 4000 characters.

A view will be created for each data source in Landing listing all the business dates with data. The syntax
for these views is <Source System>.v_BusinessDate; e.g. for the Core banking data source, the
view will be called BS.v_BusinessDate. This can be used to filter reports that access InsightLanding tables
directly.
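For example, to check which business dates have been landed for the core banking source (a sketch, assuming the view exposes a single business-date column):

SELECT *
FROM InsightLanding.BS.v_BusinessDate
ORDER BY 1 DESC;   -- most recently landed dates first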

From R19, it is also possible to build online views in InsightLanding. Online views will only be created if the
corresponding table definition in ExtractList has OnlineProcess set to 1. Landing Online views
cover a full set of all Extract List tables (including batch-only ones). They include a union of online deltas and
the previous business date’s batch data. It is important to pay attention to MIS_DATE
when doing joins.

SQL Stored Procedures


dbo.s_InsightLanding_CSI_Table_Update
Description
This procedure is used to bring in new source data into Insight Landing columnstore index tables or to
update existing source data already stored in Landing. The procedure needs to be called for each source
system.


Steps
1. Check if the source date is defined in InsightETL; if not, insert a new date
2. Check if a schema exists for the source system data being loaded. If not, create it
3. Create new clustered columnstore index tables in the schema if they do not exist. Also create the
corresponding view by calling s_InsightLanding_CSI_Views_Create
4. Load data for the current ETL date. If there is data already, delete it first and then reload
5. If the data load encounters column values greater than the largest supported type, a process is
triggered to move the column to row storage (MaxString tables). The view for the table is updated
to join both the base columnstore index table and the lookup table, making the split seamless to
the end user
6. Call s_InsightLanding_CSI_BusinessDateViews_Create to create a view with all the business dates
with data for the source system
Inputs
 Source Name – Used to pass the name of the source system to be imported into Insight Landing.
This must match one of the Source Names defined in the Extract List table. Example: BS
 BS Date – Supply a query that returns the current business date from InsightETL. Typically Select
BusinessDate from InsightETL.dbo.CurrentDate.
 TableToLand – Supplies the name of the table to be loaded in InsightLanding. Acceptable values
are the table name or the keyword ‘ALL’
 User Id – Supply the schema name for the default Insight Landing schema. This should be the
same as in Extract List and is almost always dbo.

 CreateRowBaseIndex – Enables non-clustered index creation. Acceptable values are: 0 (default
- no row-based index); 1 (create a non-clustered primary key)

 DataCompression – COLUMNSTORE (default; compress with the most performant columnstore
compression) or COLUMNSTORE_ARCHIVE (further compress the table or partition to a smaller size)

 UpdateTableSchema – 0: don't update; 1 (default): update the table schema by adding columns
missing from source tables. It will issue ALTER statements

 BatchNum – Batch number used to execute the stored procedure. This parameter introduces ETL
Batch control in the stored procedure and can be obtained from a parameter or indirectly from
dbo.Batch table through the function

 TotalThreads – allows to manually assign the number of threads used to execute the stored
procedure (accepts NULLs)

 OnlineProcess – defines if online processing should be enabled. Default value is 0, that means
regular batch ETL

 PartitionSchemeName - Name of the partition scheme. Should be left null for default one. The
database administrator can use their own existing partition scheme if other than the default one

 DeclareFailureUponUnsafeLanding – Allows ETL to be stopped if the InsightLanding update fails. It
defaults to 0, so the 'show must go on' even if IsSafelyLanded = 0 exists in the controlling work table;
otherwise, a runtime error will be raised at the end
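A hedged example call follows. The parameter names are written here as they are likely declared (the documentation lists them with spaces); verify against the deployed procedure:

EXEC dbo.s_InsightLanding_CSI_Table_Update
     @SourceName  = 'BS',
     @BSDate      = 'Select BusinessDate from InsightETL.dbo.CurrentDate',  -- query returning the current business date
     @TableToLand = 'ALL',
     @UserId      = 'dbo';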

dbo.s_DQSummary_Load
Description
This stored procedure deletes the content of the DQSummaryLog table and populates it with new DQ-
related statistics.

Inputs
 @MIS_Date – Stores the Current ETL Date

dbo.s_InsightLanding_CSI_Views_Create
Description
Removal of dated schemas means all daily loads of data from a source system are stored in a single table.
Also, the schema containing the tables and views never changes once created.

Core Banking multi-value fields can be larger than the biggest supported string type (NVARCHAR(4000)) and
would need to be stored as a MAX data type, which cannot be kept in a columnstore index. When the data load
encounters column values greater than the largest supported type, a process is triggered to move the
column to row storage (MaxString tables). This stored procedure creates reporting views by joining both the
base tables and the lookup table, making the split seamless to the end user.

Since the schemas in the columnstore index remain for the most part static, these views rarely need to be
recreated, except when new columns are added or a string in a column grows over
the biggest supported string type on the source system being loaded.

The data in InsightLanding can now be queried as follows:

select Column1, Column2, ..., ColumnN
from InsightLanding.BS.v_AA_ACCOUNT_DETAILS
where mis_date = '2017-12-31';

Note that only the columns needed should be returned in the result set.

Steps
1. Called by s_InsightLanding_CSI_Table_Update
2. Creates a temp dataset using metadata with the list of columns for the table. If a column of the table is in
the MaxStringCounter lookup table, a function is used to replace the hashed values with
the original raw values from the lookup table
3. Drops the view if it already exists
4. Creates a view based on the list of columns in the temp dataset


Inputs
 TableName – The table for which a view must be created
 SourceName – Used to pass the name of the source system to be imported into Insight Landing.
This must match one of the Source Names defined in the Extract List table. Example: BS
 BatchNum – Batch number used to execute the stored procedure. This parameter introduces ETL
Batch control in the stored procedure and can be obtained from a parameter or indirectly from
dbo.Batch table through the function

dbo.s_InsightLanding_CSI_Table_Reset
Description
This procedure is used to reset the clustered columnstore indexes in InsightLanding back to their original state
as first deployed.

The schema definitions in the columnstore index are very static; thus, if a new source data set that has changed
significantly from the first loads is to be loaded, it is best to start fresh to avoid any potential conflicts with
the data. This is the case when data from a different Temenos Core Banking environment is extracted, where
the STANDARD.SELECTION metadata definitions may be different.

Note that this stored procedure is to be run only during the implementation of the solution, as it will delete all
historical data loaded in Landing.

Steps
1. Builds a temp dataset with the list of tables and views to be dropped
2. Deletes all views for the schema passed in the parameter
3. Removes all tables
4. Resets the FlexStringLoad flag in ExtractList
5. Deletes records from the MaxStringCounter and MaxString tables
6. Resets the SourceDate table in InsightETL

Inputs
 Source System – The name of the source system that you are resetting Columnstore index data
for. This must match one of the source system names defined in Extract List. Default value is “All”
 TableName – The name of the table to reset. Default value is “All”
 BatchNum – Batch number used to execute the stored procedure. This parameter introduces ETL
Batch control in the stored procedure and can be obtained from a parameter or indirectly from
dbo.Batch table through the function
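For example, to reset all landed core banking data during implementation (a sketch; the parameter names are assumptions derived from the descriptions above):

EXEC dbo.s_InsightLanding_CSI_Table_Reset
     @SourceSystem = 'BS',    -- matches a source system name in ExtractList
     @TableName    = 'All';   -- reset every table for that source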


dbo.s_InsightLanding_CSI_Purge
Description
This stored procedure is used to purge InsightLanding data within a certain date range when columnstore
index (CSI) is in place and to rebuild the CSIs for the remaining data. It replaces the previously used
s_InsightLanding_Purge and s_InsightLanding_RangePurge stored procedures that can purge tables when
only rowstore indexes are in place.

s_InsightLanding_CSI_Purge can be used on an individual table or on all the tables landed from a specific
source system. The procedure first populates the ##LandingWorkTableCsiPurge temporary table with all
the T-SQL statements required for purging the selected tables and, if required, for rebuilding their indexes;
then it executes these statements and updates the logs. In fact, s_InsightLanding_CSI_Purge can run in a
purging or “review only” mode: the former means that the stored procedure will actually delete data from
the selected tables, while the latter means that the procedure only updates ##LandingWorkTableCsiPurge
without actually executing the statements that are stored in it.

The major R19 improvement of s_InsightLanding_CSI_Purge is that this stored procedure now uses multi-
threaded deletion per table, i.e. each of the threads assigned to s_InsightLanding_CSI_Purge deletes
the content of a certain table, so that multiple tables can be purged in parallel. T-SQL statements for index
rebuilding, instead, are multithreaded internally by the SQL engine, hence no further Temenos-designed
multithreading is necessary. Another important enhancement within this stored procedure is that
s_InsightLanding_CSI_Purge executes data deletion in chunks of 100,000 rows to improve multi-threaded
processing performance. Multi-threading is a resource-intensive operation; the
deletion by chunks within each thread therefore helps to reduce the overall resource consumption. The most
apparent direct result of this feature is that it effectively limits the consumption of the temporary space
for SQL transactions, with almost equal or sometimes even faster speed. Furthermore, it reduces the disk
space used and the size of the log file.

It should be noted that s_InsightLanding_CSI_Purge always relies on the default compression level from
the first partition considered and does not support using different compression levels on multiple partitions.
Furthermore, even though there are normally two available approaches to defragment indexes in SQL,
i.e. Rebuild or Reorganize, Rebuild is the only option offered by default in the current stored procedure, as this
option has proven to perform faster. Locally developed statements that use the Reorganize option, however,
can be designed by setting the @OutputForReviewingOnly input parameter to 1 and by subsequently
modifying the standard index rebuild statements stored in the ##LandingWorkTableCsiPurge temporary
table that will be discussed in the next section.

Global Temporary Table


As mentioned in the description, s_InsightLanding_CSI_Purge relies on a global temporary table called
##LandingWorkTableCsiPurge. This table is populated with a full log of all the T-SQL statements to be
executed during the purging process and monitors the execution of these statements by the assigned
threads. If index rebuilding is required, the table will also contain all the T-SQL statements used for this
purpose.

This table is dropped and repopulated whenever s_InsightLanding_CSI_Purge starts and dropped again
once all statements in it are completed successfully, unless s_InsightLanding_CSI_Purge is executed in
“Review Only” mode.

Note: ##LandingWorkTableCsiPurge is the name of the table in a single-tenant environment. In a multi-tenant
installation, the table name will change depending on the tenant’s name, as for the databases’ and agent
jobs’ names, and it will follow the ##TNT_LandingWorkTableCsiPurge pattern, where TNT is the tenant’s
name, e.g. ##Tenant1_LandingWorkTableCsiPurge.


The ##LandingWorkTableCsiPurge temporary table contains the following parameters:

Column Name Description


RowID Clustered identity primary key.
ActionQuery T-SQL statement to be executed by the stored procedure that
can either delete a set of rows in a table (purging statement)
or rebuild an index (rebuild statement).
Target Target of the ActionQuery statement, if any
Source Source of the ActionQuery statement, if any
SchemaName Schema of the table to be processed
TableName Name of the table to be processed
CSIndexName Name of the CSI Index to be processed
CSICompressionLevel CSI Compression Level

Steps
 Checks batch-related inputs and starts the batches required to manage processing
 Validate other input parameters
 Gets the most recent date landed from the appropriate <Source System>.SourceDate table
 Drop the content of ##LandingWorkTableCsiPurge, if any
 Log the T-SQL statements and the other attributes of the new purging process to
##LandingWorkTableCsiPurge. The statements will be used to:
 Retrieve the default values of purging dates range from
InsightLanding.dbo.SystemParametersLanding. The general purging dates range
defined by this table can be overwritten if a value exist in the PurgeOlderThan
column of dbo.ExtractList for the specific table to be purged.
 Select the Compression Level from the first partition as the new compression level
for rebuild
 Delete data from appropriate tables in chunks of 100K rows each and update the
row count
 Rebuild the columnstore index if required
 Execute tables’ data deletion in parallel
 Log known error(s) including possible threading error(s) caused by the ActionQuery submitted and
unexpected CLR error
 Log the processing detail for each table
 Remove dates within the purging range from InsightETL.dbo.SourceDate, if asked to do so via
the @PurgeSourceDate input parameter and only if the CLR deletes ran successfully
 Flag dates as purged in InsightLanding.<Source System>.SourceDate
 Rebuild the columnstore index if asked to do so in the @RebuildCSI input parameter
 Drop the content of the ##LandingWorkTableCsiPurge global temporary table
 Update dbo.EventLogDetails and finish (batch stopped)

Inputs
The s_InsightLanding_CSI_Purge procedure includes the following input parameters.


 SourceName - The value of this input parameter, that uses data type NVarChar(128), is the name
of the source system that is used as table schema name, such as: 'BS' or 'CRM'
 KeepMonthEnd - This input parameter defines whether you wish to keep month-end data for the table
or tables you are about to purge. Acceptable values are 0 or 1 (Bit data type). If set to 1, the
purge stored procedure will delete all the content for the selected tables in the selected date range
except for month-ends. If set to 0, the month-end data will be deleted too.
 TableToPurge - This input parameter, that uses data type NVarChar(128), contains the name of
the table to be purged without the schema name or the keyword 'ALL' to signify that all tables
should be purged. The default value is ‘ALL’.
 PurgeSourceDate - This input parameter defines whether dates within the purging range will be
removed from InsightETL.dbo.SourceDate. Acceptable values are 1 or 0. When this parameter
is set to 1, the dates affected by the purging will be removed from the InsightETL.dbo.SourceDate
table. When 0 is selected, the historical information will instead be kept in
InsightETL.dbo.SourceDate. The default value for this parameter is 1.
 RebuildCSI - This input parameter defines if the columnstore indexes will be rebuilt at the end of
the purging process. This compresses the rowgroups and improves overall query performance.
Acceptable values for this parameter are 1 (i.e. the procedure rebuilds the columnstore index after
rows have been deleted) or 0 (i.e. indexes are not rebuilt) and the default value is 1.
 BatchNum - This input parameter stores the current batch number and its default value is null. If
null, the stored procedure will automatically select the most recent active batch from
InsightETL.dbo.Batch and assign this to the thread
 TotalThreads - This input parameter stores the total number of threads used by the procedure
and it is automatically determined if the values assigned is null or less than 1 (i.e. the default); in
this case, server's maximum number of schedulers will be used as the number of worker threads
 OutputForReviewingOnly - This input parameter determines whether the stored procedure is
executed to purge tables or just to review its potential results. As previously highlighted, the
purging stored procedure stores all statements to be run in a temporary table called
##LandingWorkTableCsiPurge before they are executed. @OutputForReviewingOnly controls
whether the purging stored procedure stops at the step in which ##LandingWorkTableCsiPurge is
populated, or continues and executes the statements logged in this temporary table. Acceptable
values for this input parameter are 1 or 0. If 1 is selected, the stored procedure will not execute
the T-SQL delete statements and will not rebuild the columnstore indexes; instead, the review
contents produced are output to a global temporary table for reviewing purposes. If the value
selected is 0, data deletion and index rebuilding are executed as per the parameters above.

It should be noted that s_InsightLanding_CSI_Purge does not include any input parameter to define the date range to be considered for the purging, as this can be parameterized globally in v_SystemParametersLanding in InsightETL as part of the Data Manager rule definition, or specified in the dbo.ExtractList configuration table for individual tables. As mentioned in the steps section, values defined in ExtractList will supersede the definition in v_SystemParametersLanding.
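
As an illustration, a review-first invocation might look as follows (a minimal sketch using the parameters documented above; the review pass only populates ##LandingWorkTableCsiPurge, after which the same call with @OutputForReviewingOnly = 0 performs the actual purge):

-- Review pass: log the delete statements without executing them
EXEC dbo.s_InsightLanding_CSI_Purge @SourceName = 'BS', @KeepMonthEnd = 1, @TableToPurge = 'ALL',
    @PurgeSourceDate = 1, @RebuildCSI = 1, @BatchNum = null, @TotalThreads = null,
    @OutputForReviewingOnly = 1

-- Inspect the statements queued for execution
SELECT * FROM ##LandingWorkTableCsiPurge

-- Actual purge: execute the deletes and rebuild the columnstore indexes
EXEC dbo.s_InsightLanding_CSI_Purge @SourceName = 'BS', @KeepMonthEnd = 1, @TableToPurge = 'ALL',
    @PurgeSourceDate = 1, @RebuildCSI = 1, @BatchNum = null, @TotalThreads = null,
    @OutputForReviewingOnly = 0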

dbo.s_ColumnStoreIndex_Defragmentation (Lives in InsightETL database)


Description
Columnstore indexes can get fragmented like any other index. Microsoft recommends reorganizing a columnstore index after one or more data loads to achieve query performance benefits as quickly as possible.

This procedure is used to force all of the rowgroups into the columnstore, and then to combine the
rowgroups into fewer rowgroups with more rows. The ALTER INDEX REORGANIZE online operation also
removes rows that have been marked as deleted from the columnstore index tables.

Clustered columnstore indexes will be reorganized in a database when any of the following criteria are met:

 Compress all open rowgroups if rows in individual rowgroup or sum of rows for index is greater
than supplied value
 Reorganize if rowgroup less than supplied value
 Reorganize when more than supplied % of rows have been deleted
 Reorganize when any segments contain more than supplied % deleted rows
 Reorganize if more than supplied number of segments are empty

Steps
1. Create temp metadata of all clustered columnstore indexes in database
2. Create temp metadata of all OPEN rowgroups that met condition to be compressed
3. Create temp metadata of COMPRESSED rowgroups that met condition to be merged
4. Create temp metadata of rowgroups where deleted rows have met conditions to be removed
5. Issue ALTER INDEX REORGANIZE to compress all open row groups or to merge compressed row groups
6. Issue ALTER INDEX REORGANIZE to remove deleted rows in the compressed row groups

Inputs
 DatabaseName – Columnstore Index has been implemented in the InsightLanding and InsightWarehouse databases. Specify either database name
 PartitionNumber – Partition Number
 CompressRowGroupsWhenGT – Compress all open rowgroups if rows in an individual rowgroup or the sum of rows for the index is greater than the supplied value. Default is 5,000 rows
 MergeRowGroupsWhenLT – Reorganize rowgroups when a rowgroup is less than the supplied value. Default is 150,000 rows
 DeletedTotalRowPercentage – The overall percentage of deleted rows has exceeded a cutoff. Default is 10%
 DeletedSegmentsRowPercentage – The percentage of deleted rows in any individual segment exceeds a cutoff. Default is 20%
 EmptySegmentsAllowed – The number of segments that are completely empty (all rows have been deleted) has exceeded a cutoff. Default is 0 (none)
 ExecOrPrint – Execute or Print commands to reorganize columnstore indexes. Default is Exec
 BatchNum – Batch number used to execute the stored procedure. This parameter introduces ETL Batch control in the stored procedure and can be obtained from a parameter or indirectly from the dbo.Batch table through the function
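
For instance, a call that relies on the documented defaults could be written as follows (a sketch; the parameter values simply restate the defaults listed above, and the partition number shown is illustrative):

EXEC InsightETL.dbo.s_ColumnStoreIndex_Defragmentation
    @DatabaseName = 'InsightLanding',
    @PartitionNumber = 1,
    @CompressRowGroupsWhenGT = 5000,
    @MergeRowGroupsWhenLT = 150000,
    @DeletedTotalRowPercentage = 10,
    @DeletedSegmentsRowPercentage = 20,
    @EmptySegmentsAllowed = 0,
    @ExecOrPrint = 'Exec',
    @BatchNum = null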

dbo.s_InsightLanding_Purge (non-CSI)
Description
This procedure is used to purge any tables from InsightLanding that exceed the number of dates that should be stored, as configured in the Purge Older Than column of the Extract List table, starting from the date provided as an input parameter. From R18, this stored procedure will only be used if a client decides not to avail of Columnstore indexes (CSI).

Once all tables for a particular Landing schema have been purged then the schema itself is removed.

Steps
1. Find schema for date if it exists
2. Remove all tables from schema where range from schema date to purge date exceeds purge older than value
3. Remove schema if no tables are left in schema

Inputs
 Source System – The name of the source system that you are purging data for. This must match
one of the source system names defined in Extract List.

 Date – Date that should be included in the check to see if dates should be purged
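
A hedged example call (the @SourceSystem and @Date parameter names are assumed from the input descriptions above; the date is illustrative):

EXEC dbo.s_InsightLanding_Purge @SourceSystem = 'BS', @Date = '2018-12-31'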

dbo.s_InsightLanding_RangePurge (Non-CSI)
Description
This procedure is used to purge any tables from InsightLanding that are older than a certain date (defined in the PurgeOlderThan column of the ExtractList table), by internally calling s_InsightLanding_Purge. In addition to this, s_InsightLanding_RangePurge also allows you to pass a range of dates which it will loop through, using each date to check whether any table extracts are older than their defined retention period.

Any dates older than the start date parameter that is passed to this procedure will be excluded from the check and will not be removed during the purge process. This is done in case a bank needs to retain non-month-end daily extracts that exceed the range set in the PurgeOlderThan column in ExtractList.

From R17, the procedure that purges Landing data will use the new BSEndofMonth flag in InsightStaging..BSSourceDate instead of the calendar month-end, when purging daily data. From R18, this stored procedure will only be used if a client decides not to avail of Columnstore indexes (CSI).

Once all tables for a particular Landing schema have been purged then the schema itself is removed.

Steps
1. Verify if date is a month end date and if it should be retained or not
2. Exec s_InsightLanding_Purge to drop all tables where the PurgeOlderThan value in ExtractList is not null and the difference between the latest date in Landing and the current date being processed is greater than the PurgeOlderThan value.
3. If all tables in the schema have been dropped then drop the schema

4. Repeat for all dates in the specified range

Inputs
 Start Date - The earliest date that should be included in the check to see if dates should be purged. Any date older than this date will be excluded from the check and retained. Pass a valid date format that is correct for the collation being used, e.g. YYYY-MM-DD for the default collation.
 End Date – The most recent date that should be included in the check to see if dates should be purged. Pass a valid date format that is correct for the collation being used, e.g. YYYY-MM-DD for the default collation.

 Delete_EndOfMonth – If set to Yes then month ends will not be excluded from the purge check
and will be deleted if they exceed the retention period defined in Extract List. This parameter
defaults to No. Syntax: 1 for Yes, 0 for No.
 SourceSystem – The name of the source system that you are purging data for. This must match
one of the source system names defined in Extract List.
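
A hedged example call (parameter names assumed from the input descriptions above; the dates are illustrative):

EXEC dbo.s_InsightLanding_RangePurge @StartDate = '2018-01-01', @EndDate = '2018-12-31',
    @Delete_EndOfMonth = 0, @SourceSystem = 'BS'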

dbo.s_InsightLanding_Compression
Description
This procedure compresses the InsightLanding database as part of regular maintenance. The first time s_InsightLanding_Compression is executed, this is done for all tables in all schemas; since release 2017, on subsequent runs the stored procedure will only compress tables that have not previously been compressed. Page compression is used as the data compression type.

This compression stored procedure is provided out-of-the-box with any Analytics platform; however, it needs to be enabled and scheduled. It is recommended to run it at least on a weekly basis.

Steps
1. Rebuild all tables in schema with data compression set to page
2. Repeat for each schema
Inputs
 Enable - If this input parameter is set to 1, the stored procedure will compress tables at the page level (DATA_COMPRESSION = PAGE). This input parameter can be set to 0 to disable compression on a table
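
A one-line example call (a sketch, assuming the parameter name matches the input description above):

EXEC dbo.s_InsightLanding_Compression @Enable = 1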

dbo.s_InsightLandingTable_Update
Description
This procedure was used in R17 and earlier to bring new source data into InsightLanding or to update existing source data already stored in Landing within the Analytics ETL – it is still available for financial institutions on R18 that, for any reason, wish to use InsightLanding without the Columnstore index option. The procedure needs to be called for each source system whose columns are archived in InsightLanding.

dbo.s_InsightLandingDateSchemaCombineViews_Create
Description
Stored procedure used in R17 and earlier releases to create combined views from the daily loads of source system data that are stored in their own SQL schemas.

Online.s_InsightLanding_Online_Table_Update
Description
This procedure is used to copy data from the InsightImport Online schema tables into the InsightLanding Online schema tables. It is called by the Online.s_ProcessImportOnline_Update procedure.

Inputs
 Source Name – Used to pass the name of the source system to be processed. In this case it is BS
 BatchNum – Batch number used to execute the stored procedure. This parameter introduces ETL Batch control in the stored procedure and can be obtained from a parameter or indirectly from the dbo.Batch table through the function
 TotalThreads – Allows you to manually assign the number of threads used to execute the stored procedure (accepts NULLs)

Online.s_InsightLanding_Online_Views_Create
Description
This procedure creates the InsightLanding Online views. The views will be a union of the intra-day and previous business date batches. It is called by the Online.s_InsightLanding_Online_Table_Update procedure.

Inputs
 Source Name – Used to pass the name of the source system to be processed. In this case it is BS
 Table Name – Name of the table for which a view is to be created. Acceptable values are the table name or the keyword ‘ALL’
 BatchNum – Batch number used to execute the stored procedure. This parameter introduces ETL Batch control in the stored procedure and can be obtained from a parameter or indirectly from the dbo.Batch table through the function

Online.s_InsightLanding_Online_EOD
Description
This procedure processes the Online EOD. It uses the InsightLanding Online views to copy data to the BS Landing schema tables in preparation for the batch ETL end of day. It is called by the Online.s_ProcessImportOnline_Update procedure.

Inputs
 Source Name – Used to pass the name of the source system to be processed. In this case it is BS
 Table Name – Name of the table for which a view is to be created. Acceptable values are the table name or the keyword ‘ALL’

 BatchNum – Batch number used to execute the stored procedure. This parameter introduces ETL
Batch control in the stored procedure and can be obtained from a parameter or indirectly from
dbo.Batch table through the function
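
A hedged example of the Online EOD call (parameter names assumed from the input descriptions above):

EXEC Online.s_InsightLanding_Online_EOD @SourceName = 'BS', @TableName = 'ALL', @BatchNum = null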

Functions
dbo.fn_DataTypeOrder
Description
This scalar-valued function determines a ranking number for a data type compared to the other data types.
Specifically, this function returns the order of the size of a data type so that columns can be ordered by
size and so that a size can be used that will not result in any implicit casting issues.

The idea is that a column with a lower ranking data type can be inserted into a column with a higher
ranking data type.

Steps
1. Rank data type based on the table below.

DataType Rank
bit 1000
tinyint 2000
smallint 3000
int 4000
real 5000
smallmoney 6000
money 7000
decimal 7500
float 7600
smalldatetime 8000
datetime 9000
datetime2 9500
char% 100000
nchar% 200000
varchar% 300000
nvarchar% 400000

Inputs
 DataTypeName – The name of the data type, e.g. int.
 MaxLength – The length of the data type, e.g. 30.
 Precision – The number of digits in a number.
 Scale – The number of digits to the right of the decimal point in a number.
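
The following sketch illustrates the expected behaviour (the parameter order and values are assumed from the input list above; int ranks below the string types, so an int column can safely be inserted into an nvarchar one):

-- Expected to return 4000 for int, per the ranking table above
SELECT dbo.fn_DataTypeOrder('int', 4, 10, 0) AS DataTypeRank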

dbo.fn_FullDataType
Description
This function returns a full data type, e.g. nvarchar(120), given a base type (e.g. nvarchar) and a maximum length.

Steps
 Determine the full data type, e.g. nvarchar(30) given nvarchar and 30.

Inputs
 DataTypeName – The name of the data type, e.g. int.
 MaxLength – The length of the data type, e.g. 30.
 Precision – As described previously.
 Scale – As described previously.
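
A usage sketch (the parameter order is assumed from the input list above):

-- Expected to return 'nvarchar(30)'
SELECT dbo.fn_FullDataType('nvarchar', 30, 0, 0) AS FullDataType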

dbo.fn_DeFlexStringCLR
Description
This is a scalar-valued function used to import MaxString content into InsightLanding views.

Configuration
Configuring ExtractList
The ExtractList table must be populated with a record for every source table that needs to be imported into Landing. Any table that you need to bring into InsightWarehouse for analytical reporting and ad hoc analysis needs to have a record here. In addition to this, any source table that you want to store over time in Landing, to be used for reporting or by another 3rd party system, needs to have an entry in Extract List.

By default, a certain pre-configuration for the Extract List table will be included in the Advanced Analytics Platform. This configuration will handle a completely unmodified Temenos Core banking Model Bank configuration. Minor exceptions are the RE_CRF_GL and RE_CRF_PL tables, which are often renamed per Core banking installation and will have to be renamed in Extract List accordingly. The default configuration assumes these tables are titled RE_CRF_MBGL and RE_CRF_MBPL, so you will have to replace these values with your bank’s custom table names.

If core fields from any of these tables have been moved to a local table in your installation of Core banking, then you will need to create a record to import those local tables. This assumes the local table has been configured in DW.Export and InsightImport. Also, any additional tables that have been added to InsightImport, or that need to be imported from another 3rd party source system, will need to be added.

In Extract List, you will need to list out all tables in InsightImport, including the additional tables created as part of the local ref, multi-value, and sub-value processing. You will notice some of these tables are already included in the default configuration for Temenos Core banking Model Bank.

You can enable or disable online processing for a table. You can disable the import of tables that you no longer need or that are not available, but this is not mandatory. The Landing update procedure will not fail due to missing tables and will carry on.

InsightSource
Overview
InsightSource is used to combine source data from multiple source systems that may have different import
frequencies or source dates. It takes source system data with differing extract frequencies and makes one
combined copy of the most up to date data from each source system. This eventually becomes one business
date in InsightWarehouse.

Even if it is dealing with different import frequencies or source dates, InsightSource will treat data copied
from the InsightLanding database as if it were only one day of data. The content of InsightLanding tables
will be stored into InsightSource tables having the same name but with the date removed from the schema
(e.g. BS.CUSTOMER, BS.ACCOUNT, Budget.GLBudget etc.). Tables are stored in InsightSource grouped in
separate schemas for each source system (e.g. BS, Budget etc.).

Furthermore, transactions from different dates in Landing are combined here. E.g. if the Analytics ETL
process requires more than one day of transactions, the number of days to combine can be specified in
the ExtractList table in InsightLanding.

This database is volatile so, every time Analytics ETL is run, the data in it will be replaced with the latest
copy of source data from InsightLanding. InsightSource only stores one business date at a time, as
previously stated, so only the current business date being run will be populated.

From R17, ETL batch control and an upgraded multithreading process have been added to the process
updating this database and all logging has been redirected to dedicated tables in the InsightETL database.

Specific Features / Functions


 Combining Multiple Source Dates into a Single Business Date – When updating InsightSource for a particular business date, it will grab the latest source extracts from Landing with the most up to date data for that business date and combine them into one common source.
 Source Schema Synchronization – This feature is used when source data has changed and the bank may need to process both the historical data before the change and the current data after the change. This allows you to take into account both different mappings without the need to manually edit the source view mapping when you process dates on either side of the change.

Technical Details
Architecture
In the figure below, we can see how the InsightSource database fits in the Advanced Analytics platform’s
architecture.

Figure 16 - InsightSource in the Core Analytics ETL Flow

Technical Components
SQL Stored Procedures
s_InsightSource_CSI_Update
Description
This procedure is used to import data from InsightLanding to InsightSource for one source system at a
time, with the most up to date data available from each source system being loaded into InsightSource.

The selection of the latest date for the business entity (i.e. Entity Date) requires a SQL statement in the
v_systemParameters view to determine which Landing date should be used for data processing.

Steps
1. Determine correct source date for the business date being processed
2. Check if data is available in InsightLanding for the source date
3. Remove all InsightSource objects if all objects are being processed
4. Copy all source tables into InsightSource
5. Repeat for each source system
Inputs
 Sources – The source system of the tables to be processed from InsightLanding (e.g. BS). One source system schema at a time should be used and null values are not acceptable.
 BsDate – The business date currently being processed (see Configuration section)
 BatchNum – Batch number used to execute the stored procedure. This parameter introduces ETL
Batch control in the stored procedure and can be obtained from a parameter or indirectly from
dbo.Batch table through the function
 TotalThreads – Allows you to manually assign the number of threads used to execute the stored procedure (accepts NULLs)

s_InsightSource_CSI_UpdateAll
Description
This procedure is used to build a single set of data from all source systems with the most up to date data available from each source system. To do so, this stored procedure internally calls s_InsightSource_CSI_Update and uses the source dates defined in InsightETL for each source for the current business date being processed.

InsightSource is volatile, so all tables in it are dropped and recreated for each full ETL run.

Steps
1. Exec s_InsightSource_CSI_Update for the first source system
2. Determine correct source date for the business date being processed
3. Check if data is available in InsightLanding for the source date
4. Remove all InsightSource objects if all objects are being processed
5. Copy all source tables into InsightSource

6. Repeat for each source system

Inputs
 BsDate – The business date currently being processed. This date is used to look up the respective
source dates from the InsightETL Source Dates table.

s_InsightSource_Update (deprecated)
Description
This procedure was used, in R17 and earlier releases, to import data from InsightLanding to InsightSource
for one source system at a time, with the most up to date data available from each source system being
loaded into InsightSource.

This stored procedure used the source dates defined in InsightETL for each source for the business date
being processed. The InsightSource database is volatile so all tables in it were dropped and recreated for
each full Analytics ETL run.

Steps
1. Determine correct source date for the business date being processed
2. Check if data is available in InsightLanding for the source date
3. Remove all InsightSource objects if all objects are being processed
4. Copy all source tables into InsightSource
5. Repeat for each source system
Inputs
 Sources – The source system of the tables to be processed from InsightLanding (e.g. BS). One source system schema at a time should be used and null values are not acceptable.
 BsDate – The business date currently being processed. This date is used to look up the respective
source dates from the InsightETL Source Dates table.
 BatchNum – Batch number used to execute the stored procedure. This parameter introduces ETL
Batch control in the stored procedure and can be obtained from a parameter or indirectly from
dbo.Batch table through the function
 TotalThreads – Allows you to manually assign the number of threads used to execute the stored procedure (accepts NULLs)

s_InsightSource_UpdateAll (deprecated)
Description
This procedure was used, in R17 and earlier, to build a single set of data from all source systems with the
most up to date data available from each source system. To do so, this stored procedure internally called
s_InsightSource_Update and used the source dates defined in InsightETL for each source for the business
date being processed.

InsightSource is volatile, so all tables in it were dropped and recreated for each full ETL run.


Steps
1. Exec s_InsightSource_Update for first source system
2. Determine correct source date for the business date being processed
3. Check if data is available in InsightLanding for the source date
4. Remove all InsightSource objects if all objects are being processed
5. Copy all source tables into InsightSource
6. Repeat for each source system
Inputs
 BsDate – The business date currently being processed. This date is used to look up the respective
source dates from the InsightETL Source Dates table.

s_InsightSource_Synchronize_Schema
Description
This procedure is used when a data change that affects mapping has been made and there is a need to process data both before and after the change. Usually, this would require the bank to change the mapping involved depending on which date is being processed. Instead, code is added to this procedure to account for the change, and the change is reflected in the source tables before mapping into InsightWarehouse occurs in InsightStaging.

This procedure is empty to begin with, and custom code will need to be added to deal with the change in data. A simple example of where this may be employed is when mapping has changed because a core banking system field has been moved or added. If a field has been added (and included in the source data mapping in InsightStaging), then reprocessing any date before this change will fail, as the mapping will have been updated to accommodate this new field.

This procedure is not called by the default InsightETL job and will need to be added once it is required.

Configuration
InsightSource does not require any direct configuration and hence there are no configuration tables
present. However, it requires that valid source dates are already entered into InsightETL for the current
business date being processed. This update should be included as part of the InsightLanding procedures
or through a manual edit of the Source Date table in InsightETL.

InsightSource also requires that the data for each source date for each source system is present in
InsightLanding.

Source Date Selection


The selection of the latest date for the business entity (i.e. Entity Date) in s_InsightSource_CSI_Update
requires a SQL statement in the v_systemParameters view to determine which Landing date should be used
for data processing. This SQL statement will be assigned Type = SourceData and Name = 'DateSelection'
in the view. The possible values of this entry can be 'EntityDateSQL', 'LastMonthDate', 'PreviousDayDate', 'MonthEndDate' and 'CustomDateSQL' – if the DateSelection entry is not specified, the latest SourceDate in InsightLanding will be taken into consideration12.

 The 'EntityDateSQL' assumes that COBs are run on different days but it is preferable to run Analytics
ETL everyday with the most recent data of each entity. The latest SourceDate selection is the default
and all data is in the Landing table for this business date
 ‘LastMonthDate’ uses the last date of the previous month that has data
 ‘PreviousDayDate’ uses 1 day before the current date
 ‘MonthEndDate’ uses the last date of the current month
 ‘CustomDateSQL’ requires a SQL statement in SystemParamters to find the date for a source
We should check the v_systemparameters view or its corresponding table to verify what sort of date
selection will be used before we execute the stored procedure. The latest SourceDate will be used if no
'DateSelection' entry is defined in v_systemParameters
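
A quick check of this kind can be run before the ETL (a sketch; the Type and Name column names are assumed from the description above):

SELECT *
FROM InsightETL.dbo.v_SystemParameters
WHERE [Type] = 'SourceData' AND [Name] = 'DateSelection'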

Running the SQL Stored Procedures

The Analytics ETL Job is a SQL Agent Job which is configurable to execute specific workflows that can change depending on the particular client. It will be configured mostly to accept different procedure parameters but may have additional workflow built in.

InsightSource can be configured to run as part of the Analytics ETL SQL Agent job or run manually via the stored procedures and their parameters. The required stored procedure as part of the ETL job can be either of the two listed below.

Analytics ETL Procedure

 s_InsightSource_CSI_UpdateAll (bsDate)
 s_InsightSource_CSI_Update (Sources, bsDate, BatchNum, TotalThreads)
The first option is selected if we want to have all tables for all source systems imported into InsightSource at the same time. The second option is used if we want to have separate imports for separate source systems, e.g. to ease logging and troubleshooting. For example, if we have BS and Budget source systems and we want to have separate InsightSource updates, we can use the following code split into two steps:

USE InsightSource
GO

-- Step 1: update InsightSource with the BS source system tables
declare @CurrentETLDate date = (select max(BusinessDate) from InsightETL.dbo.CurrentDate);

EXEC dbo.s_InsightSource_CSI_Update @sources = 'BS', @BSDate = @CurrentETLDate, @BatchNum = null, @TotalThreads = null
GO

-- Step 2: update InsightSource with the Budget source system tables
-- (the GO separators make each step its own batch, so the variable can be redeclared)
declare @CurrentETLDate date = (select max(BusinessDate) from InsightETL.dbo.CurrentDate);

EXEC dbo.s_InsightSource_CSI_Update @sources = 'Budget', @BSDate = @CurrentETLDate, @BatchNum = null, @TotalThreads = null
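
Alternatively, to import all source systems in a single step, the UpdateAll wrapper can be called on its own (a sketch under the same assumptions as the example above):

USE InsightSource

declare @CurrentETLDate date = (select max(BusinessDate) from InsightETL.dbo.CurrentDate);

EXEC dbo.s_InsightSource_CSI_UpdateAll @BsDate = @CurrentETLDate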

12 SQLEntityDate uses Landing views, all other options use Landing tables for the select into source data.

InsightETL
Overview
The InsightETL database includes the backend tables used for the definition of business rules to categorize data, create bands, create datasets and custom tables, and perform calculations, along with the control of logging tables, performance enhancements and multi-threading. In addition to this, InsightETL has several additional features that will be discussed below.

This database replaces both the InsightMasterData and Insight databases, which were present in pre-R18 releases, and incorporates all their functionalities.

Rules Engine
GDPR
The General Data Protection Regulation (GDPR) (EU) 2016/679 is a regulation in EU law on data protection
and privacy for all individuals within the European Union (EU) and the European Economic Area (EEA). It
also addresses the export of personal data outside the EU and EEA areas. The GDPR aims primarily to give
control to citizens and residents over their personal data and to simplify the regulatory environment for
international business by unifying the regulation within the EU. This regulation grants customers the right
to have their details erased from a bank’s database, including the Advanced Analytics Platform’s. A new set
of configuration tables have been included in InsightETL to configure GDPR erasure and a dedicated chapter
of this document will discuss this new feature thoroughly.

Data Quality
As previously discussed in the chapter dedicated to the InsightImport database, Data Quality Import (DQI) is a new feature added to Analytics ETL and Process Data ExStore Import in the R18 release. Its smallest operation unit is the field level. Once configured, tables in InsightImport are subject to data quality checks and, if necessary, in-place corrections are made to the fields according to DQI rules.

Except for the hard-coded Failover rules, DQI rules are defined in the InsightETL.dbo.RuleDataQuality table.
This table, together with all the new DQI-related functions, will be discussed further on in this chapter.

Logging, Toolkit and Performance Enhancement


The InsightETL database also incorporates, from R18, the functionalities and features previously defined in the now deprecated Insight database. Therefore InsightETL is used to store some of the procedures and functions used for the enhanced multi-threading and logging processes.
Furthermore, the InsightETL database is used to store a toolkit of the procedures, functions and configuration tables used for Analytics ETL performance enhancement. Specifically, this database includes utility stored procedures which perform string operations such as split and substring before the tables are joined in the v_source views, improving the parsing of Core banking strings used for keys and the performance of the source to target mapping.
The string operations toolkit provided in InsightETL helps to solve the problems generated by the non-relational format of the core banking data and by the time-consuming and complex process of format transformation performed in the first part of Analytics ETL. The inclusion of CLR routines in the InsightImport stored procedures in charge of this and the subsequent enhancement of the multi-threading process have partially solved this issue, as explained further on in this chapter. However, further improvements can be achieved through the toolkit in InsightETL.


The toolkit can also be used to control how temporary tables are materialized from v_source views during the Extract and Transform phases of Analytics ETL. Even in this case, its purpose is to boost performance in various views.
In addition to these toolkit functionalities, InsightETL serves as a centralized logging location, where a single table stores information about any Analytics ETL process. Secondly, this database hosts the detailed logging table used by multi-threaded stored procedures in InsightImport, Landing and Source. Finally, it also stores the two separate detailed logging tables used by the core ETL processing in the InsightStaging database, which relies on a different type of multi-threading.

Multi-threading
As previously mentioned, the core ETL stored procedures in InsightImport, InsightLanding, and InsightSource all internally invoke the s_ParallelExecActionQueriesTx stored procedure in the Insight database, which is the heart of the new parallelization process.
s_ParallelExecActionQueriesTx is a SQL CLR procedure which orchestrates multi-threading and centralized logging in Analytics for the three databases above. The outcome of the logging process, however, is not stored in the Insight database but in the EventLog and EventLogDetails tables of InsightETL. We will see more detailed information about this stored procedure as we proceed.

Multi-language reports configuration


From R18, the Advanced Analytics Front end supports the use of multi-language reports. Specifically, the Translation table now includes a new column (i.e. Language) used for the configuration of multi-language reports. Translation was used, even in earlier releases, to assign report labels to tables’ attributes when these attributes are used within a report – from R18 onward, each attribute can have multiple labels depending on the language assigned.

Specific Features / Functions


 Add hierarchies to extracted data – N number of lookup tables can be easily added; their lookup tables can be used to add or update additional columns in Staging, Dim and Fact tables in the InsightStaging database, or in tables in the InsightLanding and InsightSource databases, or in abstraction views in the InsightWarehouse database.
 Add bands (range groupings) to extracted data – N number of lookup tables containing Bands can be maintained; these are used to add or update columns used to represent bands in Staging tables, or in tables in the InsightLanding and InsightSource databases, or in abstraction views in the InsightWarehouse database.
 Update values in Staging Tables – N number of Update statements can be stored in InsightETL; these can be used to add or update columns in Staging or Fact tables. Tables or views can be set up to support these customizations.
 Manages Datasets – N number of business rules can be easily added to set the values of multiple fields.
 Values Split – A table is maintained that allows column splits to be defined.
 Data Quality Import Configuration – A table is maintained to define rules for Data Quality Import.
 Maintain System ETL Dates – A record of the current ETL date and dates for previous ETL runs per source system are maintained in InsightETL.
 Defines report labels and enables multi-language – A table is maintained to define attribute labels in reports and can be configured for multi-language settings.
 Centralized Logging – InsightETL stores the EventLog table which provides centralized logging for any ETL Analytics process in any database. In addition to this generic logging table, InsightETL hosts the EventLogDetails table. This is where the stored procedures and functions responsible for enhanced multi-threading store a very detailed log for each task initiated and/or completed by a thread, within the initial part of Analytics ETL in InsightImport, Landing and Source. This database also hosts the logging tables for Batch control. InsightETL also contains the StagingEventLog and StagingEventLogDetails tables. These are the tables where the stored procedures and functions performing the Extraction, Transformation and Loading of InsightStaging and InsightWarehouse tables store a detailed log for each task initiated and/or completed by a thread. The EventLog table is referenced by EventLogDetails and also by StagingEventLog.
 Performance Enhancement Toolkit – InsightETL hosts a number of configuration tables for the indexing of source to target mapping and the parsing of Core banking compound keys in Analytics ETL.
 GDPR Erasure configuration – InsightETL stores ad hoc technical components for the configuration of consent management and personal data erasure according to the General Data Protection Regulation.

Technical Details
Architecture
InsightETL business rules are incorporated into the Analytics ETL data flow during the Extract and
Transform phases of the core Analytics ETL, which is carried out in the InsightStaging database.

InsightETL updates are called in the following instances.

InsightImport rules – as specified in the rule definition

InsightLanding rules – as specified in the rule definition

InsightSource rules – as specified in the rule definition

InsightStaging rules – as specified in the rule definition, but normally according to the following principles:


 Dimension Data
o At the end of the extract – after source tables have been merged into the staging table
o At the beginning of the transform – before Dim table is created
 Fact Data
o At the end of the transform – after Fact table is created

InsightWarehouse rules – as specified in the rule definition

These updates are driven by the Rule engine-related tables (more details will be provided later in this
section).

Figure 18 - InsightETL in the core Analytics ETL Dataflow


Technical Components
InsightETL consists of critical tables, views and stored procedures that add data to, customize and control the Analytics ETL process.

Tables
dbo.RuleDefinitions
This table stores definitions for business rules for each installation. It replaces the now deprecated
InsightMasterData.dbo.CustomColumn table that was used to manage customization rules in pre-R18
releases.

RuleDefinitions has the details of the column(s) being added and the key columns that they are based on; e.g. a new Product hierarchy column can be based on a composite of product code columns. It also stores custom SQL code that can be inserted into the Analytics ETL process.

As previously mentioned, there are five types of rules that can be added to a database using Data Manager,
i.e. Lookup, Banding, Calculation, Dataset and CustomTable. The rule type is specified in the Operation
column of the RuleDefinitions table.

The RuleDefinitions table has the following columns:

Column Name Description


RuleDefinitionsId Record Id (identity column). Populated automatically
TenantIid Foreign key. Id of the database tenant.
RuleGroupId Reserved for future use.
DatabaseName The Database name of the base table to which the rule is applied. E.g.
InsightStaging
SchemaName The schema of the base table to which the rule is applied. E.g. dbo
TableName The name of the custom base table to which the rule is applied. E.g.
StagingCustomer
ExecutionPhase If the rule defined is applied to the InsightStaging database, this column
stores the stage of Analytics ETL in which the operation will take place.
Generally should be ‘Extract’ unless not possible. If not possible, verify
with BI Analyst if there needs to be another value, such as ‘Transform’.
ExecutionStep If an execution phase is defined, this column will store the Step in which
the rule is applied within the selected phase. This parameter is relevant if
one rule is dependent on another within the same phase. E.g. The first
rule can be assigned a 1 and the second 2.
IsPersisted This column defines if the business rule will be applied to a table or a
view. Acceptable values are:
1 – Add a physical column to a base table.
0 – Create a view around a base table with the column(s) added to the
view.
In R18, option 0 is not in use and 1 will be the default.
Description Descriptive information about the rule
SourceItemId Computed column storing the hashed unique ID of the Rule definition.
ItemName Rule label structured as follows <ObjectName>-
<ColumnNameToWhichTheRuleIsApplied> E.g. GLTran-
SourceCurrencyID


Operation Type of rule defined. Acceptable values are Lookup, Banding, Calculation,
Split, MaxColumn and Dataset.
SourceSQLStatement This is the source SQL statement used by the business rule if the Operation selected is Dataset.
DataSetDescription Descriptive information about the source SQL statement, if populated.
SourceStoredProcedureName Reserved for future use.
DatasetId Reserved for future use.
CreatedBy Name of the user or Process that created the Business rule.
CreatedDate Date in which the Business rule was created.
LastModifiedBy Name of the user or Process that last modified the Business rule.
LastModifiedDate Date in which the Business rule was last modified.
BusinessDateFrom Date from which the Business rule becomes active. This date is set to
1900-01-01 but this can be modified for temporal metadata (future use.).
BusinessDateTo Date until which the Business rule remains active. This date is set to 9999-
12-31 but this can be modified for temporal metadata (future use).
IsActive If set to 1, this flag makes the rule active. If set to 0 or empty, the rule is
inactive.
IsCurrent Reserved for future use.
IsPublished Reserved for future use.
SourceTableName Reserved for future use.
SourceTableSchemaName Reserved for future use.

SourceTableDatabaseName Reserved for future use.
TargetDatabaseName Name of the database hosting the target table used if the Operation
column is set to Lookup or Custom Table.
TargetSchemaName Name of the schema of the target table used if the Operation column is
set to Lookup or Custom Table.
TargetObjectName Name of the table used if the Operation column is set to Lookup or Custom Table.
TargetViewName Name of the target view used. Using this column turns a regular Lookup
into a CustomTable.
Configuration This column is used to make version control in this table more consistent
and easier to manage. Configuration defines what the source for the
current row is in the table (provided out-of-the-box by Temenos, added
later by the client as a result of local development etc.) – available values
are:
- ModelBank: this entry has been added to satisfy Temenos core
banking mapping and/or business rules
- Local: the entry is used during the implementation to update or
enhance Framework or ModelBank functionality
- Framework: this entry has been added by the TFS Framework
solution and it is core banking agnostic
- PBModelBank: Used for Private Banking record definitions


- CampaignAnalytics: Used for Campaign Analytics solutions
- Predictive: Used for Predictive Analytics solution when deployed
RecordSource Reserved for future use.
SourceRuleDefinitionId Hashed id used to uniquely identify a rule definition when importing rules.

UpdateNumber Reserved for future use.


IsTemporal If set to 1, this column defines that the business rule is temporal. Currently set to 0 for all Rule Definitions and sub tables.
PreviousIsTemporal Reserved for future use.
RuleDefinitionRowhash Hashed rule definition. This column is used to identify changes to rules.
DeletedBy Name of the user or Process that deleted the Business rule.
DeletedDate Date in which the Business rule was deleted.
LegacyItemId Reserved for future use.
HashTag Reserved for future use.
TableFilter Reserved for future use.

dbo.RuleColumns
This table stores the details of the columns populated by the business rules.

The RuleColumns table has the following columns:

Column Name Description


RuleColumnId Record Id (identity column). Populated automatically
SourceRuleColumnId Hashed unique identifier for source data.
RuleDefinitionId Foreign key to RuleDefinitions
CustomDataSubItemId Reserved for future use.
ColumnOrder Order of columns e.g. 1. This is very important when the rule
type/operation is ‘Lookup’. In this case, ColumnOrder is used to match
columns with the corresponding RuleValues: Key/CustomColumn1,2 etc.
ColumnContents The name of the column affected by the rule. E.g. GLDescription.
ColumnType Type of column. Acceptable values are:
 KeyColumn – The column(s) that are used for:
o Lookup: Join the lookup table (derived table based on RuleValues) to the base table.
o Banding: The column on which the banding is based.
o Dataset: Join the dataset to the base table.
 CustomColumn – The columns added back to the base table as a result of the rule. Not applicable to Split, Calculation and MaxColumn.
 ColumnName – The column name of a calculation; works effectively like a CustomColumn above.
SQLExpression The expression for rules with operation = ‘Calculation’. E.g. Case When
Column1 = ‘1’ Then ‘Yes’ Else ‘No’ End.
Delimiter Reserved for future use.
NumberofSplits Reserved for future use.
DataType The datatype of the column added as a result of the rule. Applicable to
columns with type equal to CustomColumn and ColumnName.
ColumnNameSplitSuffix The suffix of a created Split column. For example if the Column ColumnA
is split, and the suffix is “Comp” the second split would be
ColumnA_Comp_2.
CreatedBy Name of the user or Process that created the entry.
CreatedDate Date in which the entry was created.
LastModifiedBy Name of the user or Process that last modified the entry.
LastModifiedDate Date in which the entry was last modified.
BusinessDateFrom Date from which the Business rule becomes active. This date is set to
1900-01-01 but this can be modified for temporal metadata (future use.).
BusinessDateTo Date until which the Business rule remains active. This date is set to 9999-
12-31 but this can be modified for temporal metadata (future use).
ColumnRankOrder Reserved for future use.
RankOrderDirection Reserved for future use.
KeyColumnOrder Reserved for future use.
IsTemporal Reserved for future use.
RuleColumnRowHash Hashed rule column definition. The hash is used to detect changes to the
record used for system load purposes.
IsActive If set to 1, the column is active. If set to 0 or left empty, the rule is
inactive.
IsCurrent Reserved for future use.
PreviouslyTemporal Reserved for future use.
DeletedBy Name of the user or Process that deleted the entry.
DeletedDate Date in which the entry was deleted.
ColumnValue Reserved for future use.
ColumnFilter Reserved for future use.
SourceRuleColumnItemId Reserved for future use.
JoinTableName Reserved for future use.
JoinColumnName Reserved for future use.

dbo.RuleValues
This table is used to store the values of the virtual lookup table for each Operation = Lookup rule. It performs a role similar to that of CustomValue in pre-R18 releases.

The RuleValues table has the following columns:

Column Name Description



RuleValueId Record Id (identity column). Populated automatically


CustomValueId Reserved for future use.
SourceRuleValueId The hashed natural key of the table.
RuleDefinitionId Foreign key back to the rule definition.
ValueOrder Reserved for future use.
KeyValue1-15 The Key Values are used to join back to the base table, and as a key for
the lookup table. E.g. KeyValue1, could be mapped to ProductCode. This
is done in RuleColumns. ColumnType would be keycolumn, Operation
would be set to ‘Lookup’ and ColumnOrder would be 1.
CustomValue1-15 Each Custom Value correspond to the Key Value with the same sequence
number. E.g. The KeyValue1 (ColumnContents = ProductCode) = 55,
could map to CustomValue1 = Mortgage.
In the corresponding RuleColumns entry there would be a columnType =
‘CustomColumn’ for Operation = ‘lookup’ and ColumnOrder would be 1.
ColumnContents would be ‘ProductType’
CreatedBy Name of the user or Process that created the entry.
CreatedDate Date in which the entry was created.
LastModifiedBy Name of the user or Process that last modified the entry.
LastModifiedDate Date in which the entry was last modified.
IsPublished Reserved for future use.
BusinessDateFrom Date from which the Business rule becomes active. This date is set to
1900-01-01 but this can be modified for temporal metadata (future use.).
BusinessDateTo Date until which the Business rule remains active. This date is set to 9999-
12-31 but this can be modified for temporal metadata (future use).
IsActive If set to 1, the column is active. If set to 0 or left empty, the rule is
inactive.
IsCurrent Reserved for future use.
IsTemporal Reserved for future use.
PreviouslyTemporal Reserved for future use.
RuleValueRowHash Hashed rule value definition. The hash is used to detect changes to the
record used for system load purposes.
DeletedBy Name of the user or Process that deleted the entry.
DeletedDate Date in which the entry was deleted.
SourceRuleValueItemId Reserved for future use.
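
To make the relationship between the three rule tables concrete, the query below shows how an active Lookup rule and its values could be inspected (a sketch; the join columns follow the foreign keys described above, and the sample mapping restates the ProductCode 55 to Mortgage example):

SELECT rd.ItemName, rc.ColumnContents, rc.ColumnType, rv.KeyValue1, rv.CustomValue1
FROM InsightETL.dbo.RuleDefinitions rd
JOIN InsightETL.dbo.RuleColumns rc ON rc.RuleDefinitionId = rd.RuleDefinitionsId
JOIN InsightETL.dbo.RuleValues rv ON rv.RuleDefinitionId = rd.RuleDefinitionsId
WHERE rd.Operation = 'Lookup' AND rd.IsActive = 1
-- e.g. rv.KeyValue1 = '55' (ProductCode) maps to rv.CustomValue1 = 'Mortgage' (ProductType)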

dbo.RuleExpressionLevels
This table stores rule expression levels and is used for Banding only. The RuleExpressionLevels table has the following columns:

Column Name Description


RuleExpressionLevelsId Record Id (identity column). Populated automatically
SourceRuleExpressionLevelsId Hashed entry id.
RuleColumnId Foreign key from the RuleColumn table.
RuleDefinitionId Foreign key from the RuleDefinition table.
BandOrder Order with which rule expression levels are applied to data.
SQLExpression The SQL expression to produce the bands required. One or many. Eg.
Between 0 and 10
Between 10 and 20
> 20
BandName The labels corresponding to the bands defined in SQL Expression
RuleKeyColumnId Rule Key Column Id
RuleColumnOrderId Reserved for future use.
CreatedBy Name of the user or Process that created the entry.
CreatedDate Date in which the entry was created.
LastModifiedBy Name of the user or Process that last modified the entry.
LastModifiedDate Date in which the entry was last modified.
BusinessDateFrom Date from which the Business rule becomes active. This date is set to
1900-01-01 but this can be modified for temporal metadata (future use.).
BusinessDateTo Date until which the Business rule remains active. This date is set to 9999-
12-31 but this can be modified for temporal metadata (future use).
IsActive If set to 1, the column is active. If set to 0 or left empty, the rule is
inactive.
IsCurrent Reserved for future use.
IsTemporal Reserved for future use.
RuleExpressionLevelRowHash Hashed column definition. The hash is used to detect changes to the record used for system load purposes.
PreviouslyTemporal Reserved for future use.


DeletedBy Name of the user or Process that deleted the entry.


DeletedDate Date in which the entry was deleted.
SourceExpressionLevelsItemId Reserved for future use.

dbo.RuleFilters
This table is designed to store optional filters for rules with Operation = ‘MaxColumn’ only. It will be used
for future development.

dbo.RuleCustomers
This table is used for GDPR configuration and it contains the details of customers who have erasures
initiated, in progress or have completed erasures.

dbo.RuleCustomersRuleColumns
This table is used for GDPR configuration. It represents an intersection between RuleCustomers and
RuleColumns. It contains the erasure date (ActionDate) for a particular column/ customer combination, as
well as the erasure replacement value.

dbo.RuleReplacements
This table is used for GDPR configuration. It contains replacement values based on datatype and table type
(dimension or other) for Analytics (InsightWarehouse) erasures.

dbo.CDPPurpose
This table is used for GDPR configuration. It contains retention periods for different column purposes. It is used to calculate the ErasureDate of a column. Only necessary for Analytics (InsightWarehouse) erasures.

dbo.RuleDataQuality
This table is where all the DQI (Data Quality Import) rules are defined, except for the hard-coded Failover rules. Each row in this table corresponds to a rule defining how to handle the replacement of a column with a specific data type.

Each rule is applied to a column, namely either on its data type or on its value. There are four types of
configurable DQ rules, listed in the order of high to low precedence as follows:

1. RegEx rule;

2. Equal rule;

3. General rule;

4. Default rule.

RegEx, Equal and General rules are defined to match on data values, but Default and Failover rules are set for data types. Default rules are pre-installed during the setup, with RuleColumnId always equal to -1 (see below).

Hard-coded rules are specially designed as the last means to automatically revive a failing Analytics ETL. Failover rules remain inactive until the rare event in which all other DQI rules, even the Default ones, are for some reason unavailable. Besides the Failover rules, there is also a set of hard-coded special General rules. The former are deployed in memory with binary CLR code and the latter are deployed to RuleDataQuality

through the s_MergeKeyColumnDQRules procedure to ensure all the primary-key-to-be columns are never
imported with nulls.

Except for the special General rules enforcing primary key columns to be non-null, any other DQI Replacement value defined in the RuleDataQuality table can be customized to the client’s preference.

Due to the SQL_VARIANT type that the Replacement column employs, a special syntax with explicit CAST/CONVERT must be used; otherwise the text directly typed into the table is always interpreted literally as NVarChar.

Finally, it should be noted that multiple DQI rules can be defined for a column. Rules are chained in series and applied when checking and revising the column considered. Rules having the same precedence level are applied one after the other, in no particular order, until no more DQI issues are detected. If all the rules with the same precedence have been applied but further DQI errors are detected, the next level of rules with lower precedence will activate to try to resolve the issue in the problematic column, if more rules are defined.

Creation of new rules


A default row set is shipped with the installer in the RuleDataQuality table. If additional DQI rules are required, the new rule definitions need to be added first as new entries in the InsightETL.dbo.RuleDefinitions and InsightETL.dbo.RuleColumns tables. The RuleColumnId column in the RuleDataQuality table refers to the same-named column in RuleColumns.

For a RuleDefinitions record to be correctly associated by the DQI process with the corresponding data
quality rules, it must have the following columns’ attributes set:

 DatabaseName = ‘InsightImport’;
 SchemaName and TableName are set to the target table;
 Operation = ‘DataQuality’ or ‘Data Quality’;
 IsActive = 1;
 IsCurrent = 1.

For DQI to recognize the relevant data quality-related entries in RuleColumns, they must have the following columns’ attributes set:

 RuleDefinitionId must have a proper ID that links to the RuleDefinitions entry for Data Quality;
 ColumnContents is set to the target column name;
 ColumnType = ‘DataQuality’ or ‘Data Quality’;
 IsActive = 1;
 IsCurrent = 1.

Important Note: RuleDataQuality is very sensitive and will get corrupted if not updated properly. Replacements for bad data use the default values defined in this table. The table cannot be edited directly, as it contains sql_variant and computed columns; rows can only be inserted or updated using a SQL script in which the replacement value is explicitly converted to the target type.


Figure 19 - Sample of update script for the RuleDataQuality table
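
In the spirit of that sample, an update must convert the replacement value explicitly so the sql_variant keeps the intended underlying type (a sketch; the targeted rule and value are illustrative):

-- Change the default replacement for the money data type to 0,
-- converting explicitly so the sql_variant keeps the intended underlying type
UPDATE InsightETL.dbo.RuleDataQuality
SET Replacement = Convert(sql_variant, Cast(0 as money))
WHERE RuleColumnId = -1
  AND DataType = 'money'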

Column Name Description


RuleDataQualityId Record Id (identity column). Populated automatically
IsDefaultForDataType For a Default rule, this computed column indicates if it is
applied to the data type (1) or to the affected value (0).
RuleColumnId For locally developed rules, this column is a foreign key
that references to the column with the same name in the
InsightETL.dbo.RuleColumns table.
Rows having ‘RuleColumnId = -1’, instead, refer to the
default row set which should never be removed but can
be customized.
IsForMatchedOnly Computed column indicating whether Replacement
column’s content (see below) is only applied to the
matched values. If the value of this field is 1, the Match
column must be populated.
DataType Computed column showing the data type of the
Replacement column’s value E.g. nvarchar, money, date
etc.
PreferNull When PreferNull is activated and the target column is nullable, NULL will be used rather than the Replacement value. This column only accepts two values, i.e. 1 (active) and 0 (inactive).
Replacement This column defines the actual replacement value for
any column whose data type matches with the value
defined in the DataType column and that presents a
Data Quality issue. E.g. Replacement value can be an
empty string for nvarchar data type, 0 for money, 1900-
01-01 for date etc.
Replacement is of SQL_VARIANT type. When adding
new rules programmatically, the actual underlying type
must be explicitly provided by converting the
replacement value to the specific data type, e.g.
Convert(sql_variant, Cast(-1 as int)).
Match In addition to replacing corrupted column data, financial
institutions can also create DQI rules identifying and
replacing attributes that match a certain value or a
certain range of values. This can be implemented using
the Match column. This column is always null for Default
and General rules, but can be used for the definition of
RegEx and Equal rules.
The Match column can store one of the following values:

 Null – DQI unconditionally replaces the corrupted data without having to match any value or pattern;
 String literal – DQI replaces the value only when it equals the string literal, in a case-insensitive manner;
 Non-string-typed value – DQI replaces the value only when it equals the typed value. The comparison must be between two values having the same identical data type;
 Regular Expression pattern – DQI replaces values only when they match the RegEx pattern (case-sensitive). The IsMatchedByRegEx column must be 1 to activate this mode. The RegEx option in use is IgnorePatternWhitespace. For detecting Null, please use ‘Is DBNull’; for detecting Date, DateTime, DateTime2, SmallDateTime or DateTimeOffset types of data, use only the ISO8601 roundtrip format, e.g. yyyy-MM-ddTHH:mm:ss.fffffffzzz.

IsValidMatchReplacement Computed column indicating whether the underlying
data types of the values in the Match and Replacement
columns are identical.
IsMatchedByRegEx Defines if the DQI process will replace values when they
match the RegEx pattern (see the Match column). This
column only accepts two values i.e. 1 (Yes) and 0 (No).
MaxLenghtInBytes Computed column showing the maximum length of
Replacement value in bytes.
Precision Computed column indicating the numeric precision. E.g.
the number 123.45 has a precision of 5.
Scale Number of digits to the right of the decimal point in a
number. E.g. the number 123.45 has a scale of 2.
Comments Any descriptive comments about the DQI rule.
CreatedBy User or process who created the rule. E.g. Framework.
CreatedDate Date and time in which the rule was created. E.g. 2018-
05-11 11:10:43.987
LastModifiedBy Last user who modified the rule.
LastModifiedDate Date and time in which the rule was last modified.
DeletedBy User or process who deleted the rule.
DeletedDate Date and time in which the rule was last deleted.

dbo.AttributeCalculations
This system configuration table is used to define string operations (i.e. splits, calculations and dataset
creation) on columns processed during Analytics ETL/Process Data ExStore. Operations in this table can be
used both to improve the performance of Analytics ETL by splitting up compound column values and to create
brand new columns with the aid of T-SQL functions or stored procedures.
For example, AttributeCalculations can be used to define that the compound id of a LIMIT record should
be split into three parts to ease the identification of the customer @Id in the record (which is part of the
limit id but not part of the limit record), then place the extracted Customer @Id in a separate column in
LIMIT.
Or, e.g., AttributeCalculations can be used to create a new column in the ACCOUNT table which results
from the calculation of the limit reference number from the compound LIMIT.REF field.
AttributeCalculations is an optional table and will only be made available if the size of a client’s database
requires it.
Even though AttributeCalculations is capable of defining working splits, calculations and datasets in both
R17 and R18 releases, from the R18 release it is recommended to use this table only for the local development
of Splits. Locally developed Datasets and Calculations should instead be defined using the Data Manager (AKA
Rules Designer) functionality of the Analytics Front End Web application. The details of these
Dataset and Calculation rules will be stored in the back end through the Rule Engine-related tables of
InsightETL.

ColumnName Description
DatabaseName Name of the database hosting the table in which the
operation (split, calculation or dataset calculation) is done.
SchemaName Schema name of the table in which the operation is done.
TableName Name of the table in which the operation is done.
ColumnName The use of this column depends on the value of the
Operation column associated with it:
For Split operations – Name of the column in which
the split operation is done
For Calculation operations – Not Used
For Dataset operations – Name of the column
storing the results of the Dataset’s stored procedure
ColumnOrder Position of the column subject to operation or
calculation in its table (e.g. 1 for @Id in CUSTOMER
table)
Operation Specifies if the entry defined is used for a Split or a
Calculation or a Dataset
- Split - the operation defined in this entry will
split the compound value of a specific column
in a table on the delimiter and add the
resulting values as new columns, appended
at the end of the table.
- Calculation – the current entry defines a
calculation which can be performed on one or
multiple columns. The resulting value can
either be used to overwrite the old value of a
column in the table or be appended as a new
column at the end of the table

- Dataset – this type of entry defines a
column calculation which is performed by a
stored procedure. This stored procedure is
executed as part of Analytics ETL, in the
InsightStaging database, and populates the
column set as ColumnName in the Dataset
definition

Delimiter Delimiter to be used for a Split operation only (does
not apply to Calculations or Datasets). Only
one-character delimiters are supported. E.g. -
SQLExpression The SQL expression to be used for a Calculation
operation (does not apply to Splits or Datasets).
E.g.
CASE WHEN MAT_DATE = 0
THEN ‘19000101’
ELSE MAT_DATE
END
ColumnSplitNameSuffix Suffix appended to the name of new column created
at the end of a table to store the results of a Split
(does not apply to Calculations or Datasets). The new
column name will have the following syntax:
ColumnName + ColumnSplitNameSuffix + Sequence
Number
E.g. If the ColumnSplitNameSuffix is Comp and
ColumnName is @Id, the names of the new columns
containing the split will be
@Id_Comp1, @Id_Comp2 … @Id_CompN
CalculationColumnName Name of the column appended to the end of the table
to store the output value of a Calculation (does not
apply to Splits or Datasets). Note: when the
Operation is set to Calculation and the DatabaseName
is set to InsightSource, the CalculationColumnName
should be prefixed by ETL E.g. ETL_PRINCIPAL
DataType The data type of all the results for any kind of Split
operation or Calculation (this field is not used for
Datasets). E.g. Nvarchar(100)
Configuration Defines the type of configuration for this entry.
Acceptable values are:
- Framework: which means it has been
added by the TFS Framework solution and it
is core banking agnostic
- ModelBank: this entry has been added to
satisfy Temenos core banking mapping
and/or business rules
- Local: the entry is used during the
implementation to update or enhance
Framework or ModelBank functionality

- PBModelBank: used for Private Banking record definitions
- CampaignAnalytics: used for Campaign Analytics solutions
- Predictive: used for Predictive Analytics solution when deployed

DatasetName Name of the temporary table storing the results of the
Dataset stored procedure in InsightStaging (only
applies to Dataset operations). E.g.
StagingAccountAggregations
DatasetJoin Id Column of the temporary InsightStaging table
mentioned in DatasetName. This column can be used
within the JOIN clause/s in the Dataset stored
procedure (only applies to Dataset operations) E.g.
SourceAccountId
DatasetStoredProcedureName Name of the stored procedure used to perform the
required calculations. The results of said calculations
will be stored in the column mentioned in
ColumnName (only applies to Dataset operations).
E.g. s_StagingAccountAggregations
DatasetSchemaName Schema of the temporary InsightStaging table
mentioned in DatasetName (only applies to Dataset
operations). E.g. dbo
ExecutionPhase Defines the Analytics ETL Execution Phase in which
the Split or Calculation is executed, e.g. Extract.
This column does not apply to Datasets. Also, this
column does not apply to Splits or Calculations that are
not executed as part of the core ETL, e.g. those run
in the InsightImport database
ExecutionStep Defines the Analytics ETL Execution Step in which the
Split or Calculation is executed, e.g. 2.
This column does not apply to Datasets. Also, this
column does not apply to Splits or Calculations that are
not executed as part of the core ETL, e.g. those run
in the InsightImport database
Enabled_ Defines if the Calculation, Split or Dataset should be
processed or not during AnalyticsETL/Process Data
ExStore. Acceptable values are 1 (meaning Yes) or 0
(meaning No).
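
As a minimal sketch (not shipped configuration), a Split entry like the LIMIT example described above could be registered as follows; the BS schema, the ‘.’ delimiter and the execution phase/step values are assumptions:

-- Minimal sketch: split the compound LIMIT @Id on '.' into @Id_Comp1..N.
INSERT INTO InsightETL.dbo.AttributeCalculations
       (DatabaseName, SchemaName, TableName, ColumnName, ColumnOrder,
        Operation, Delimiter, ColumnSplitNameSuffix, DataType,
        Configuration, ExecutionPhase, ExecutionStep, Enabled_)
VALUES (N'InsightSource', N'BS', N'LIMIT', N'@Id', 1,
        N'Split', N'.', N'Comp', N'Nvarchar(100)',
        N'Local', N'Extract', 2, 1);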

dbo.Indexes
This configuration table is used to create additional indexes in tables of the InsightSource database,
especially indexes on foreign keys.

If complex v_source views are used on large datasets, the core Extraction and Transformation phases
of the Analytics ETL, which perform string manipulation and source-to-target mapping, can run
into performance issues. This generates the need for creating additional indexes in the InsightSource
database, which can be configured through this table (although it can potentially also be used to create
indexes in other databases, e.g. Landing).

ColumnName Description
TableName Name of the table for which the index is created.
E.g. ACCOUNT
ColumnName Name of the column for which the index is created.
E.g. Categ_Entry
ColumnOrder The order of the column in the index that will be
created.
E.g. 1
IndexNumber If set to 1, all the columns in a table will be in one
index. Else, multiple indexes will be created in the
same table.
E.g. 1
IndexType Type of index, i.e. CLUSTERED or NONCLUSTERED.
The default value is NONCLUSTERED.
E.g. NONCLUSTERED
ColumnUsage Defines the column usage in the index. Acceptable
values are:
- Equality: for CLUSTERED or
NONCLUSTERED key columns
- Include: to be used if a nonclustered index
is extended by including non-key columns in
addition to the main index key columns.
E.g. Equality
IndexSortOrder Sorting order for the index. E.g Asc or Desc.
DatabaseName Name of the database hosting the table for which the
index is created.
E.g. InsightSource
SchemaName Schema of the table where the index will be created.
E.g. BS
UniqueIndex Defines if the index is unique or not, depending on
the index type. Acceptable values are 1 (meaning
Yes) or 0 (meaning No).
Configuration Defines the type of configuration for this entry.
Acceptable values are:
- Framework: which means it has been
added by the TFS Framework solution and it
is core banking agnostic
- ModelBank: this entry has been added to
satisfy Temenos core banking mapping
and/or business rules

- Local: the entry is used during the implementation to update or enhance Framework or ModelBank functionality
- PBModelBank: used for Private Banking record definitions
- CampaignAnalytics: used for Campaign Analytics solutions
- Predictive: used for Predictive Analytics solution when deployed

Enabled_ Defines whether the index is enabled (value set to 1)
or not (value set to 0)
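
For illustration only (values taken from the examples above, not shipped configuration), a nonclustered index on InsightSource.BS.ACCOUNT(Categ_Entry) could be requested as follows:

-- Minimal sketch: request a single-column nonclustered index.
INSERT INTO InsightETL.dbo.Indexes
       (DatabaseName, SchemaName, TableName, ColumnName, ColumnOrder,
        IndexNumber, IndexType, ColumnUsage, IndexSortOrder,
        UniqueIndex, Configuration, Enabled_)
VALUES (N'InsightSource', N'BS', N'ACCOUNT', N'Categ_Entry', 1,
        1, N'NONCLUSTERED', N'Equality', N'Asc',
        0, N'Local', 1);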

dbo.Batch
This log table stores general info for each Analytics ETL batch execution in InsightImport, Landing and
Source. It contains the following columns:

ColumnName Description
BatchId Record Id (identity column).
StartTime Date and time in which the batch started running.
EndTime Date and time in which the batch stopped running.
BatchStatus Processing status of the batch E.g. Processing,
Started etc.
LastEventLogID Id of the last event logged in EventLog and
EventLogDetails by this batch (only for multi-
threaded procedures in InsightImport, Landing and
Source)
LoginName The SQL Login whose credentials are used by the
batch to run. The format used will be Domain\Login
Name e.g. ANALYTICSSERVER\AnalyticsUser
HostName The server name in which the batch was running.
IsActive If set to 1 (e.g. True) this column defines that the
batch is currently active, while 0 (e.g. False) means
that the batch is inactive.
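
As an illustrative query (column names as documented above), a currently running Analytics ETL batch can be spotted by filtering on the IsActive flag:

-- Illustrative only: list currently active Analytics ETL batches.
SELECT BatchId, StartTime, BatchStatus, LoginName, HostName
FROM   InsightETL.dbo.Batch
WHERE  IsActive = 1;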

dbo.EventLog
This log table stores an entry for each main event of each Analytics ETL batch.
Any successfully executed event taking place in the Advanced Analytics platform is recorded in this table,
and each row represents a normal event.
If exceptions are encountered, however, an additional record summarizing the outcome of the failed task
will be written to the log. Also, a more detailed log of the exception will be stored in the
EventLogDetails, StagingEventLog and StagingEventLogDetails tables discussed below.

ColumnName Description
EventLogId Record Id (identity column).

BatchId Id of the Batch which executed the task
SourceObject Name of the source object involved in the task.
E.g. if the task consists of copying the CUSTOMER
table from InsightImport to Landing, the
SourceObject will be
(BS) InsightImport.dbo.CUSTOMER
SourceRows Number of rows involved in the task
E.g. if the task consists of copying 42 rows from one
table to another, the value of SourceRows will be 42
TargetRows Number of rows the task gave as output
E.g. if the task consists of copying 42 rows from one
table to another but 10 are filtered out by a WHERE
clause, the value of TargetRows will be 32
TargetObject Name of the target object involved in the task
E.g. 20160425BS.CUSTOMER
EventTime Date and time in which the event took place
E.g. 2017-02-21 11:35:17.080
ElapsedTime Time elapsed between the beginning of an event and
its completion
E.g. 00:00:00.023
InvolvedModule Stored procedure executed
E.g. InsightImport.Insight.s_ImportTable_Update
Severity Type of event logged. Available values are:
- Warning (for non-critical exceptions
encountered)
- Error (for errors which stopped the process)
- Information (for normal events)

Information Text providing information on the task executed and
its outcome.
E.g. s_Import_Control started (for Information log)
Could not dump data to *_tmp tables (for Warning)
Failed in creating a primary key. Check
[InformationDetails] (for Error)
InformationDetails Text providing more specific information regarding
the event if applicable
SQLStatement Original SQL Statement (action query) executed if
applicable. This may be in XML format

dbo.EventLogDetails
This log table contains further details about the information logged in the EventLog table, but only for tasks
executed within Analytics ETL in the InsightImport, InsightLanding or InsightSource databases. Any entry
recorded here will have an EventLogId and will point back to its parent table, EventLog.
Multithreading information from the XML output parameter of the Insight s_ParallelExecActionQueriesTx
stored procedure is parsed into this table. s_ParallelExecActionQueriesTx enables enhanced multi-threading
in the InsightImport, Landing and Source databases and is described in detail in the chapter about the
Insight database.
EventLogDetails does not store detailed logs for the core Extraction, Transformation and Load phases of
Analytics ETL, which take place in the InsightStaging database. InsightStaging-related logs are
instead recorded separately in the StagingEventLog and StagingEventLogDetails tables, also hosted in
InsightETL.

ColumnName Description
EventLogDetailsId Record Id (identity column).
EventLogId Record Id of the corresponding EventLog entry
(foreign key).
EventTime Date and time in which the event took place
E.g. 2017-02-21 11:35:17.080
Action Action performed. Currently not in use.
ElapsedTime Time elapsed between the beginning of an event and
its completion
E.g. 00:00:00.023
ObjectTargeted Name of the object (e.g. the table) targeted by the
task, if applicable
E.g. AA_ARR_CUSTOMER
InvolvedModule Stored procedure executed, if applicable
E.g. InsightImport.Insight.s_ImportTable_Update
Severity Type of event logged. Available values are:
- Warning (for non-critical exceptions
encountered)
- Error (for errors which stopped the process)
- Information (for normal events)
TaskStatus Status of the task executed, if applicable. Available
values are RanToCompletion, Faulted or null.
RowsAffected The number of rows affected by the task, if
applicable.
E.g. 42
Information Text providing information on the task executed and
its outcome.
E.g. s_Import_Control started (for Information log)
Could not dump data to *_tmp tables (for Warning)
Failed in creating a primary key. Check
[InformationDetails] (for Error)
InformationDetails Text providing more specific information regarding
the event if applicable
SQLStatement Original SQL Statement (action query) executed if
applicable. This may be in XML format

dbo.StagingEventLog
This log is updated by the stored procedures executing the Extract, Transform and Load phases during the
core Analytics ETL process in InsightStaging. It tracks execution times for the extract, transform and load
stages, rows processed, Type 1 or Type 2 changes, the number of updates and error messages for each
dimension, fact and bridge table. Use this log to review record counts, execution times or ETL run progress.

It stores general information regarding each table involved in the Staging process, and the EventLogId column
points back to its parent table, EventLog.
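
For example, an illustrative query (column names as documented below) to review per-table phase timings for the most recent staging batch might look like:

-- Illustrative only: phase timings for the latest staging batch.
SELECT TableName, RowsExtracted, ExtractSeconds, TransformSeconds, LoadSeconds
FROM   InsightETL.dbo.StagingEventLog
WHERE  StagingBatch = (SELECT MAX(StagingBatch)
                       FROM InsightETL.dbo.StagingEventLog)
ORDER  BY TableName;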

Column Name Description


StagingEventLogId Unique identifier
EventLogId Id of the corresponding EventLog entry (foreign key)
Severity Type of event logged. Available values are:
- Warning (for non-critical exceptions encountered)
- Error (for errors which stopped the process)
- Information (for normal events)
TableName Name of the Warehouse table eg. DimAccount
StagingBatch Id of the batch which executed the process
StagingBatchStart Date/Time the ETL started in InsightStaging
TableStart Time the table started to process
ExtractStart Time the extract phase started
RowsExtracted The number of rows extracted from the InsightSource tables.
ExtractFinish Time the extract phase completed.
TransformStart Time the transform phase started
RowsTransformed The number of rows transformed from the InsightSource tables.
TransformFinish Time the transform phase completed.
LoadStart Time the load phase started
NewRows The number of new rows inserted during the load phase.
NewRowsTime Time of the new row inserts.
Type1Changes Number of Type 1 dimension change
Type1ChangesTime Time of the Type 1 changes
Type2Existing Number of Type 2 changes
Type2ExistingTime Time of the Type 2 changes
Type2Inserts The number of type 2 record inserts, records should only be inserted
when there is a change to a customer record.
Type2InsertsTime Time of the Type 2 inserts
CaseChanges Number of case changes
CaseChangesTime Time of the case changes
LoadFinish Time the Load phase completed
TableFinish Time the table completed
StagingBatchFinish Date/Time the ETL completed in InsightStaging
ExtractSeconds Elapsed processing time of extract phase
TransformSeconds Elapsed time of transform
LoadSeconds Elapsed time of load
NumberUpdates Number of dimension updates
LastErrorMessage Last error message logged
WarningMessage Last warning message logged

dbo.StagingEventLogDetails

This table stores detailed information on the tasks executed during the ETL process in InsightStaging for each
table involved. This transactional-style log is updated by the InsightStaging stored procedures. Each
procedure writes to StagingEventLogDetails at the beginning of the procedure and before any major
steps within the procedure. This log can be used to investigate process interruptions or failures, or specific
procedure execution times for the core Extraction, Transformation and Load phases of Analytics ETL.

The StagingEventLogId column points back to its parent table, StagingEventLog.

Column Name Description


StagingEventLogDetailsId System generated id
StagingEventLogId Foreign Key to Staging Event Log
EventTime The date/time of the step
Action_ Extract/ Transform/ Load
ElapsedTime Time elapsed from the beginning to the end of the task
ObjectTargeted Name of the object (e.g. the table) targeted by the task, if
applicable
E.g. DimAccount
InvolvedModule Stored procedure executed, if applicable
E.g. InsightImport.Insight.s_ImportTable_Update
Severity Type of event logged. Available values are:
- Warning (for non-critical exceptions encountered)
- Error (for errors which stopped the process)
- Information (for normal events)
RowsAffected Number of rows returned
Information Text providing information on the task executed and its
outcome.
E.g. s_Extract_SourceTable started (for Information log)
Could not dump data to *_tmp tables (for Warning)
Exception encountered while creating primary key
PK_sourceEmployeeBSDAO to table
dbo.sourceEmployeeBSDAO (for Error)
InformationDetails Text providing more specific information regarding the event
if applicable
SQLStatement Original SQL Statement (action query) executed if applicable

dbo.CurrentDate
This table stores the date of the current ETL run.

For s_InsightLandingTable_Update this date must match the date of the data being loaded into
InsightLanding.

s_InsightStaging_update, i.e. the stored procedure which updates InsightStaging and carries out the core
Analytics ETL functionalities, will not run unless this date is the same as the date of the data being run.

Column Name Description

BusinessDate The date that will populate the BusinessDate field in
InsightWarehouse.

dbo.SourceDate
This table stores the business date that will be used for each source system, and maps that date to the
actual business date. In the case where there are multiple entities for a source system with different dates,
there will be one row for the base source date and additional rows for each entity.

When a historical run is initiated, this table is used to pull the appropriate table based on the InsightLanding
table schema name. For cases where the business date may be different for different sources, e.g. BS
source vs Budget source, it is critical that this table is set up correctly.

Column Name Description


SourceId Record Id (identity column). Populated automatically
BusinessDate The date that will populate the Businessdate field in
InsightWarehouse.
SourceSystem The name of the source system should match the
sourcename in InsightLanding..ExtractList. When the
SourceDate can be different for each business entity the
source system will be shown as Source:Entity. Eg.
BS:BNK.
SourceDate The date returned by
InsightLanding..ExtractSourceDate query. The actual
date of the data.
IsDateOverride This yes/no can be used to override the automatic date
selection. If the value is ‘yes’, the system uses the
SourceDate entered. If the value is ‘no’ or ‘null’, the
system will use the ExtractSourceDate query in
InsightLanding. This would typically be used for
alternate data sources where you are matching monthly
data (Credit Card/MF/Budget) to daily banking data.
OnlineBusinessDate The current Import Online business date

dbo.Translation
This table is used to store report label translations.

Column Name Description


TranslationId Record Id (identity column). Populated automatically
ObjectType This must be one of ColumnName, CubeHierarchy,
ReportLabel or ReportParameter
Item Name Source item name. e.g. HasProduct1
DisplayName Value to replace Item Name. e.g. Has Savings
Format Number format e.g. d (dates), n0 (number, 0 decimals)
DisplayFolder Folder name for groupings e.g. Rates

Language Defines the language for the report label. The same label
can have a different label for each language
implemented in the installation.
Configuration This column is used to make version control in this table
more consistent and easier to manage. Configuration
defines what the source for the current row is in the
table (provided out-of-the-box by Temenos, added later
by the client as a result of local development etc.) –
available values are:
- ModelBank: this entry has been added to
satisfy Temenos core banking mapping and/or
business rules
- Local: the entry is used during the
implementation to update or enhance
Framework or ModelBank functionality
- Framework: this entry has been added to the
TFS Framework solution and it is core banking
agnostic
- PBModelBank: Used for Private Banking
record definitions
- CampaignAnalytics: Used for Campaign
Analytics solutions
- Predictive: Used for Predictive Analytics
solution when deployed

dbo.TableRowCountAudit
This table stores the record count for databases involved in ETL processing (e.g. InsightImport,
InsightLanding etc.). It is populated by the s_PopulateAuditCounts stored procedure during ETL.

Column Name Description


BatchId Batch Id (identity column).
ETLPhase ETLPhase considered
BusinessDate Business Date considered
DatabaseName Name of the database considered
ExtractSourceName Name of the source table from which data was extracted
SchemaName Schema of the table considered
TableName Name of the table considered
TableCount Record count
TableExists Flag confirming whether the table considered was
created or not. Acceptable values are 1 (i.e. Yes) or 0
(i.e. No).
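
For instance, an illustrative reconciliation query (the date literal is a placeholder) comparing record counts across ETL phases could be:

-- Illustrative only: compare table counts across ETL phases for one date.
SELECT ETLPhase, DatabaseName, TableName, TableCount
FROM   InsightETL.dbo.TableRowCountAudit
WHERE  BusinessDate = '2018-05-11'
ORDER  BY TableName, ETLPhase;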

Online.OnlineBatch
This log table stores general info for each online processing run in InsightImport and Landing. It contains
the following columns:

ColumnName Description
BatchId Record Id (identity column).
OnlineBatchStart Date and time in which the online process started
running.
OnlineBatchFinish Date and time in which the online process stopped
running.
OnlineBatchSeconds Delta time between the start and the end of the online
process in seconds
OnlineBatchStatus Processing status of the online process E.g.
CompletedSuccessfully etc.
LastEventLogID Id of the last event logged in OnlineEventLog by this
online process
LoginName The SQL Login whose credentials are used by the
batch to run. The format used will be Domain\Login
Name e.g. ANALYTICSSERVER\AnalyticsUser
HostName The server name in which the batch was running.
IsActive If set to 1 (e.g. True) this column defines that the
online batch is currently active, while 0 (e.g. False)
means that the online batch is inactive.
MIS_DATE Business date in YYYY-MM-DD format
MinOnlineOutputId Id of the Minimum Online output
MaxOnlineOutputId Id of the Maximum Online output
NumRowsToProcess Number of rows to be processed
Severity Type of event logged. Available values are:
- Warning (for non-critical exceptions
encountered)
- Error (for errors which stopped the process)
- NULL (for normal events)

Information Text providing information on the task executed and
its outcome.

Online.OnlineEventLog
This table logs an entry for each main step of an online micro batch process. Detailed information on the step
executed will be logged in the EventLog and EventLogDetails tables (for multithreading activity).

If exceptions are encountered, however, an additional record to summarize the outcome of the failed task
will be written to the log.

ColumnName Description
OnlineEventLogId Record Id (identity column).
OnlineBatchId Id of the OnlineBatch used to track all activity for the
Online micro batch run
OnlineBatchStart Date and time in which the online process started
running.
Step Description of the online event
StepStart Date and time in which the step started.
StepFinish Date and time in which the step ended.
StepSeconds Duration of the step execution in seconds.
NumTablesProcessed Number of tables processed
NumRowsProcessed Number of rows processed
NumTablesSkipped Number of tables skipped
NumRowsSkipped Number of rows skipped
Severity Type of event logged. Available values are:
- Warning (for non-critical exceptions
encountered)
- Error (for errors which stopped the process)
- Notification (for normal events)

Information Text providing more specific information regarding
the event if applicable. E.g. s_Import_Control started
(for Information log);
Could not dump data to *_tmp tables (for Warning);
Failed in creating a primary key. Check
[InformationDetails] (for Error)

PartitionMeasureGroups (deprecated)
PartitionMeasureGroups was a table used to configure incremental cube processing when at least one Analytics
Content Package was installed in the Advanced Analytics Platform. Each entry in the table represented a
measure group.

Column Name Description


MeasureGroupId Numeric Record Id (identity column). Populated
automatically
MeasureGroupName Name of the measure group considered e.g.
CustomerMeasures
CubeName Name of the cube to which the measure group belongs
e.g. Customer
DatabaseName Name of the SSAS database where the cube considered
is stored e.g. InsightWarehouseOLAP
CubeType Type of measure stored in the cube E.g. Snapshot
DateDimensionName Date Dimension on which the measure group can be
aggregated e.g. BusinessDate

TableInstructions (deprecated)
This table was used by the InsightDataManager to control which fields were read-only and which functions
were enabled or disabled on a MasterData view.

Column Name Description


Id Record Id (identity column). Populated automatically
TableName Name of the view considered e.g. v_SystemSecurity
Description Description of the view (nulls are acceptable)
EditorSP Currently not in use
HelperSP Currently not in use
ReadColumns List of read-only columns on a view, separated by ;
E.g.
SystemSecurityid;ForeignKey;SourceSystemSecurityID
DefaultFilter Any default filter on the view results
DeleteDisabled If the value of this column is 1, the deletion of normally
editable columns is disabled. 0 or null means that
standard columns can be deleted
EditDisabled If the value of this column is 1, the update of the
normally editable column is disabled. 0 or null means
that standard columns can be edited
AddDisabled If the value of this column is 1, adding a new value
to normally editable columns is disabled. 0 or null means
that new values can be added to standard columns

CustomColumn (deprecated)
In pre-R18 releases, this table stored data that could be customized for each installation. It held the details
of the column(s) being added and the key columns that they were based on. For example, a new Product
hierarchy column could be based on a composite of product code columns. It also stored custom SQL code
that could be inserted into the ETL process.

There were five types of data that could be added to the CustomColumn table:

Type Type Description Description and Example

0 Mapping Maps a system-generated distinct list of codes to user-defined categories. E.g. a distinct list of Product Codes is mapped to a table containing Classification and Category: Dmd-reg-24 has a classification of Chequing and a category of Deposits.
1 Grouping (bucketing or banding) A SQL string defining a group is mapped to a description of the group. E.g. a balance group is defined: between 5000 and 25000 has a BalanceGroup of 5K – 25K.
2 Value Assignment A valid T-SQL expression with a column, string, function, etc. that can be assigned to the Custom Column. E.g. SourceEmployeeID2 value assigned to SourceTellerID.
3 Scripts or custom code The code is stored which is used to set the value of fields. E.g. update stagingMember set NumMemberCard = ‘x’.
4 Staging table create Copies data from the specified view or table to a new table in InsightStaging. E.g. a rate from MasterData is applied to values after they are rolled up to a Customer level: you would copy the rate table as a type ‘4’ then use custom code of type ‘3’ for the calculation.

The following needed to be defined in the CustomColumn table:

Column Name Description


CustomColumnId Record Id (identity column). Populated automatically
Table_ The name of the staging table; determines which object
(staging table) the new column belongs to. For example,
ProductClassification belongs in Account (StagingAccount) since
v_SourceAccountBS data flows into the StagingAccount table.
SourceColumn Type 0 Master Data:
List of columns to be used as the Business key for the mapping that will
take place. The name of the columns needs to be separated with ‘|’
between them.
Type 1 Master Data:
The name of the column which needs to have a grouping defined, for
example, Balance
Type 3 Master Data:
A SQL update query used to update a field in one of the staging tables.
Type 4 Master Data:
The source view name
CustomColumn Type 0 Master Data:
Name of the columns that will be mapped. Multiple columns should be
separated by ‘|’. These are the fields which will be mapped by the
business.
Type 1 Master Data:
The name of the band or group eg. BalanceGroups.
Type 3 Master Data:
The description of what the query does, in the text.
Type 4 Master Data:
The name of the destination table in InsightStaging
Type 0 for lookup tables, 1 for creating groupings, 2 for renaming, 3 for
adding custom code, 4 for staging table create.
UpdateOrder Indicates the order in which this operation takes place. This allows dependencies
between mappings as long as the update order is accurate. E.g. 9

SourceDataFilter A logical expression that can be added to the ‘where clause’, resulting in
the mapping being applied to that portion of data. This can be null.
ExecutionPhase Stage of ETL in which the operation will take place. Generally this should be
‘Extract’ unless not possible. If not possible, verify with a BI Analyst whether there
needs to be another value, such as ‘Transform’.
ViewName Name of the view created to abstract the underlying table and make it easier
to consume by end users. E.g. ProductClassification. If the type column is not
set to 0 or 1, this should be left empty.
Configuration This column is used to make version control in this table more consistent
and easier to manage. Configuration defines what the source for the
current row is in the table (provided out-of-the-box by Temenos, added
later by the client as a result of local development etc.) – available values
are:
- ModelBank: this entry has been added to satisfy Temenos core
banking mapping and/or business rules
- Local: the entry is used during the implementation to update or
enhance Framework or ModelBank functionality
- Framework: this entry has been added by the TFS Framework
solution and it is core banking agnostic
- PBModelBank: Used for Private Banking record definitions
- CampaignAnalytics: Used for Campaign Analytics solutions
- Predictive: Used for Predictive Analytics solution when deployed
Enabled_ This column is used in connection with the Configuration column.
Acceptable values for Enable_ are 1 (which means Yes), 0 (which means
No) or NULL (also means NO).
If a table row has the Enabled_ flag set to 1, the Core banking table
definition defined in the row will be taken into consideration during the
Analytics ETL process, otherwise, it will be ignored. Enabled_ is used to
both exclude redundant Core banking tables from being loaded into
InsightImport and to disable obsolete table definitions which should not
be erased or overwritten.

CustomValue (deprecated)
In pre-R18 releases, the CustomValue table contained the values to be mapped. 15 Source and Custom
values were available to be mapped. This is applicable to Type 0 and Type 1 MasterData.

Column Name Description


CustomValueId System generated
CustomColumnId The ID of the parent custom column record
UpdateOrder The order in which the values should be displayed
SourceValue1…SourceValue15 A key column on which the mapping is based. Corresponds to
CustomColumn.SourceColumn. All distinct values of these columns
are populated when s_transform_SME is run.

CustomValue1…CustomValue15 A value to be mapped. Corresponds to
CustomColumn.CustomColumn.

Views
Master Data views are views defined in the CustomColumn and CustomValue tables in order to create
virtual lookup tables to enable easier mapping of custom values. The Data Manager feature in the Analytics
web application uses the views to expose the mappings to the user.

The views contain the following:

Column Name Description


ViewName + ‘Id’ A surrogate key, used internally in Insight; it cannot
be updated.
ForeignKey This is only updated when adding a new row – refer to
an existing row to determine the correct foreign key. This
column is not to be changed for existing rows in the view.
Source columns (field names as specified in
CustomColumn..SourceColumn, e.g. ProductCode) Source columns refer to one or more columns specified in
[SourceColumn]. For each unique combination of values in
the source columns, a row is automatically added to the
view. The source columns are provided as a reference for
the end user to make updates to Custom Columns and are
not to be updated for existing rows.
Custom Columns (field names as specified in
CustomColumn..CustomColumn, e.g. Category,
Classification and ProductDesc) Custom columns refer to one or more columns specified in
[CustomColumn] above. For existing rows, the custom
columns are the only columns that should be updated (when
an update is required). When a row is initially added, if a
value is present in @StagingTable it will be set to that value,
otherwise it will be set to ‘N/A’. After initially being added,
values stored in this view’s column supersede values in
@StagingTable.

V_ProductClassification Example
This demonstrates a Business Rule with type set to Lookup.

There can be many codes that identify a product in a source system, and the codes can be cryptic. A
ProductClassification view is set up in InsightETL to group these codes into manageable categories for
analysis such as ‘Savings’, ‘Term Deposits’, ‘Mortgage’. The following picture shows how the Product
Classification business rule appears in the RuleDefinitions table. Note the Operation column set to Lookup
and the name of the associated view defined in the TargetViewName.

Figure 20 - RuleDefinitions definition for Product Classification business rule

The figure below shows what the corresponding records in the RuleValues table look like. Please
note that the RuleDefinitionId for all the records shown matches the RuleDefinitionId in the RuleDefinitions
table above.

Figure 21 - RuleValues entries for Product Classification business rule (partial sample)

In the following picture, we can see how the rule designer configuration is reflected in the Product
Classification view.

Figure 22 - v_ProductClassification view output (partial sample)

The Analytics ETL process would populate ProductCode and the Users populate the other columns.
Analytics ETL will then add the mapped columns to the StagingAccount table.

V_ActiveAccount
This view is used by the system to set whether an account is Active in the system. It sets the IsActive flag
based on ClosedDateStatus, BalanceStatus, and AccountStatus.

V_SystemParameters
This view is used to provide custom run parameters where they are needed for Currency, Date selection and
other custom requirements in InsightStaging.

V_SystemParametersLanding
This view is used to provide custom run parameters where they are needed for Currency, Date selection and
other custom requirements in InsightLanding.

V_COA
This view is used to provide Chart of Account mappings for the GL objects and financial reports.

Other views
The views above (and others) are provided by default, but each installation can add additional ones or alter
the existing views as needed.

v_AllLog

This view joins all the aforementioned logging tables together, providing detailed info in event timeline
order.
Due to the amount of data returned, the user should try the other, filtered views first to obtain the
information of interest.

v_ActiveLog

This view is based on InsightETL.dbo.v_AllLog. It returns only the logs related to the currently ACTIVE
batch.

v_ErrWarn3

This view is based on InsightETL.dbo.v_ActiveLog. It returns only the error logs (including warnings) and
adjacent logs (one record before and one record after the error log).
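
As an illustrative usage example (the EventTime ordering assumes the view exposes that column from the underlying logs):

-- Illustrative only: errors/warnings for the active batch, with context rows.
SELECT *
FROM   InsightETL.dbo.v_ErrWarn3
ORDER  BY EventTime;
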
v_ConvertCDPRules
This GDPR-related view maps the content of the Temenos Core Banking CDP_DATA_ERASED_TODAY table
to the InsightETL RuleDefinitions and RuleColumns column names.

v_ConvertCDPRuleCustomers
This GDPR-related view maps the content of the Temenos Core Banking CDP_DATA_ERASED_TODAY and
CUSTOMER_ACTIVITY tables to RuleCustomers column names.

v_ConvertCDPRuleCustomersRuleColumns
This GDPR-related view maps the content of the Temenos Core Banking CDP_DATA_ERASED_TODAY table to
RuleCustomersRuleColumns column names.

v_ConvertCDPAnalyticsRulesDimensions and v_ConvertCDPAnalyticsRulesFacts

These GDPR-related views map the content of the InsightWarehouse..Datadictionary CDP columns to
RuleDefinitions and RuleColumns column names.

V_AgeGroup (deprecated)
This view was an example of a type 2 master data view where the customer age is banded into age groups
for easier reporting. Each client would configure the age ranges for their reporting needs.

Stored Procedures and functions


dbo.s_CreateRulesFromUI
Description
This stored procedure was included in InsightETL from R18 and is part of the Rule Engine functionality.
It calls the JSON parsing functions, accepts JSON files from the front end, and inserts the JSON data
into the following tables to either update or create a new rule14:

 RuleDefinitions
 RuleColumns
 RuleExpressionLevels
 RuleValues

The dbo.s_LoadTemporalTables stored procedure (discussed next in this section) does the work of
populating the tables, either temporally or not. Foreign keys between the tables are also populated by this
procedure.

Inputs
 @Json - JSON file passed from the front-end user interface. The data type for this parameter is
Nvarchar(Max)
 @UserName – User name of the Analytics user who input the rule.
 @BusinessDate – Business date on which the new rule was input

Outputs
 @RuleDefinitionIdMessageOut – returned output parameter in the format <success or failure
code>:<RuleDefinitionID>. The success or failure code can have two values, i.e. 1 for error and
2 for success. If an error is encountered, further details will be output in the ErrorMessageOut
parameter. E.g. if the RuleDefinitionIdMessageOut is ‘1:1234’, the code 1 means the rule data was
successfully added but the rule did not run due to an error, and the RuleDefinitionID is 1234. If the
message returned is ‘2:1234’, the code 2 means the rule 1234 was created and it ran successfully.
 @ErrorMessageOut – Details of the error message, if any.

Example Call
DECLARE @p3 NVARCHAR(100)
SET @p3 = NULL
DECLARE @p4 NVARCHAR(MAX)
SET @p4 = NULL

EXEC s_CreateRulesFromUI
    @Json = N'[ --See appendix for full example.
    ]',
    @UserName = N'Admin@temenos.com',
    @RuleDefinitionIdMessageOut = @p3 OUTPUT,
    @ErrorMessageOut = @p4 OUTPUT

SELECT @p3 AS RuleDefinitionIdMessageOut,
       @p4 AS ErrorMessageOut

14 To achieve the update, data is passed on to the following functions:
dbo.fn_GetFromJson_RuleDefinitions, dbo.fn_GetFromJson_RuleColumns,
dbo.fn_GetFromJson_RuleExpressionLevels and dbo.fn_GetFromJson_RuleValues. These are documented
further on in this chapter and section.

dbo.s_LoadTemporalTables
Description
This stored procedure loads a view or table into another table using a dynamic SQL merge. It can match
on the table’s surrogate key or a hashed natural key. It can load temporally (where historical changes are
preserved) or non-temporally.

Inputs
 @SourceDataBaseName - The database in which the source table or view is.
 @SourceSchemaName – The schema of the source table or view to be loaded into a target
table.
 @SourceTableName - The name of the source table or view to be loaded into a target table.
 @TargetDataBaseName – The database in which the target table or view is, i.e. the table or
view that is going to be loaded with source data.
 @TargetSchemaName – The schema of the target table or view.
 @TargetTableName - The name of the target table or view.
 @BusinessDateFrom - If the source data has the IsTemporal column set to 1 then the value of
this parameter will be the businessdate of the current load. Else this parameter will be set to 1900-
01-01.
 @BusinessDateTo – Business date in which the new rule will expire. This parameter is set to
9999-12-31 for new records. For updated records, if IsTemporal column’s value is 1, this
parameter’s value will be set to the current BusinessDate.
 @BusinessDate - The current business date.
 @RowHashColumnName - The name of the binary(20) column with row hash value used to
compare source and target records. Source and Target Tables need to have a RowHash column
for this StoredProcedure to work.
 @SourceNaturalKeyColumnName - The name of the natural key (of type binary(20)) of the
target table.
 @SurrogateKeyColumnName - The name of the surrogate key of the target table.
 @AllowDelete - If set to 1, this parameter allows IsActive to be set to 0 when the
NotMatchedBySource condition is met.
 @UserName – User name associated with the Analytics user who inputted the rule.

 @MatchColumn Int – This parameter is used for choosing the matching column to use for
the dynamic merge. Acceptable values are: 1 = match on the target table’s surrogate key, e.g.
RuleDefinitionId, and 2 = match on the target table’s sourceNaturalKey, e.g. SourceRuleDefinitionId.
 @ParentSurrogateKeyColumnName – The name of the parent table surrogate key, eg.
RuleDefinitionId.
 @Debug – This parameter should be set to 1 for debug information to be shown when the
procedure is running in SSMS.

Outputs
 @ParentSurrogateKeyColumnMessageOut – This parameter stores a message for consumption
by the front end. It consists of the concatenation of two parameters, i.e. a SuccessFail code (1
or 0) and the SurrogateKeyColumnID of the created record. E.g. 1:1234.
 @ParentSurrogateKeyColumnValueOut Int - The surrogate key value of the parent target
table.
 @ErrorMessageOut nvarchar(Max) - The error message, if applicable
 @SuccessFailOut – Success or fail code i.e. 1 for success or 0 for fail.

Example Call
Exec s_LoadTemporalTables
@SourceDataBaseName,
@SourceSchemaName,
@SourceTableName,
@TargetDataBaseName,
@TargetSchemaName,
@TargetTableName,
@BusinessDateFrom,
@BusinessDateTo,
@BusinessDate,
@RowHashColumnName,
@SourceNaturalKeyColumnName,
@SurrogateKeyColumnName,
@AllowDelete,
@UserName,
@MatchColumn,
@ParentSurrogateKeyColumnName,
@Debug,
@ParentSurrogateKeyColumnMessageOut = @RuleDefinitionIdMessageRuleDefinitions output,
@ParentSurrogateKeyColumnValueOut = @ParentSurrogateKeyColumnValueOut output,
@ErrorMessageOut = @ErrorMessageOut output,
@SuccessFailOut = @SuccessFailOut output;

dbo.s_CreateRuleGroup
Description
This stored procedure looks after the execution of the rules defined through the Rule Engine. It reads the
configuration tables RuleDefinitions, RuleColumns, RuleExpressionLevels, RuleValues and RuleFilters
and creates a new base table or view with columns added as a result of the defined rules.

s_CreateRuleGroup is executed in the InsightStaging..s_Generic_Extract and
InsightStaging..s_Generic_Transform stored procedures (which will be discussed in the InsightStaging
chapter) or within the Analytics ETL agent job.

Inputs
 @DataBaseName - The database in which the rule is.
 @TableName - The name of the table storing the rule.
 @SchemaName – The schema of the table storing the rule

 @ExecutionPhase - core Analytics ETL phase in which the operation is taking place (only
applicable to Dataset operations)
 @ExecutionStep - core Analytics ETL step in which the operation is taking place (only applicable
to Dataset operations)
 @BusinessDate - The current business date.
 @IsPersisted – If set to 1, the rule will add a physical column to a base table. If set to 0, the rule
will create a view around a base table with the column(s) added to the view.
 @RuleDefinitionId – RuleDefinitionId value

 @BatchNum – Batch number used to execute the stored procedure.
 @StagingEventLogID – Id for the logging table

Outputs
 @RuleDefinitionIdMessageOut – returned output parameter in the format <success or failure
code>:<RuleDefinitionID>. The success or failure code can have two values, i.e. 1 for error and
2 for success. If an error is encountered, further details will be output in the ErrorMessageOut
parameter. E.g. if the RuleDefinitionIdMessageOut is ‘1:1234’, the code 1 means the rule data was
successfully added but the rule did not run due to an error, and the RuleDefinitionID is 1234. If the
message returned is ‘2:1234’, the code 2 means the rule 1234 was created and it ran successfully.
 @ErrorMessageOut nvarchar(Max) - The error message, if applicable
 @SuccessFailOut – Success or fail code i.e. 1 for success or 0 for fail.

dbo.s_DQBulkCopyCSV
Description
This SQL-CLR procedure is used to perform the multithreaded Data Quality import process, using CSV files as
sources. It is internally called by InsightImport.Insight.s_DQImportTable during Analytics ETL.

Inputs
 @SourceCSVType – Type of Source CSV file used.
 @TargetTableName – Name of the Target Table.
 @IsIdentityInsertOn – Defines whether explicit values can be inserted into the identity column
of the table (1) or not (0)
 @ParallelismDegree – The maximum number of worker threads that can be used in parallel.
The default setting is -1, which means to determine automatically.

 @Encoding – Character encoding. For Temenos Core Banking CSVs, the default value for
@Encoding parameter is set as 1683 for UTF-16 Big Endian. Other supported values could be
7(UTF-7), 8(UTF-8), 1673(UTF-16LE), 3273(UTF-32LE), 3283(UTF-32BE) or 18030(GB-18030).

Outputs
 @xmlLog - The details of multithreading info are outputted from the SQL-CLR stored procedure
into an XML variable.
 @TotalRowsTransferred – Total number of rows transferred
 @RowsRevised – Total number of rows revised
 @FieldsRevised – Total number of columns revised

dbo.s_DQParseMulti-value
Description
This SQL-CLR procedure is used to perform multithreaded multi-value parsing with Data Quality, using the
corresponding base table as the source. It supports parsing all types of ‘sub’ tables, including LR, LRSV, MV
and MVSV.

Inputs
 @TargetTableName – Full name of the target sub table.
 @IsIdentityInsertOn – Defines whether explicit values can be inserted into the identity column
of the table (1) or not (0)
 @ParallelismDegree – The maximum number of worker threads that can be used in parallel.
The default setting is -1, which means to determine automatically.

Outputs
 @xmlLog - The details of multithreading info are outputted from the SQL-CLR stored procedure
into an XML variable.

dbo.s_ParallelExecActionQueriesTx
Description
This SQL-CLR stored procedure executes in parallel a series of T-SQL action queries within
each of the aforementioned ETL stored procedures. Action queries are used to copy or delete source system
data, create new tables and run other update queries during Analytics ETL processing.
s_ParallelExecActionQueriesTx also outputs detailed logs, including threading information, warnings, any
encountered errors and the outcome of each individual action query monitored.

To better understand how this process works, let us take the example of the s_ImportTable_Update stored
procedure in InsightImport. The core action performed by this routine, i.e. the bulk insert statement which
populates Import tables from CSV files, is considered an individual action query.

When s_ParallelExecActionQueriesTx is called within the s_ImportTable_Update stored procedure, the bulk
import statement within the Import Table Update procedure is assigned to the @xmlQueries input
parameter of the multi-threading SQL-CLR procedure as follows.

As can be seen below, the SELECT statement used to assign a value to @xmlQueries is followed by three
elements aliased as ActionQuery, Target, and Source. The ActionQuery element contains, as text, the action
query to be executed in parallel. The other two elements, which are also required by @xmlQueries, will be
explained in the Inputs section below.

SELECT @xmlQueries = (
    SELECT
        N'BULK INSERT [InsightImport].[dbo].[v_' + @TableName + '] FROM ' +
        QUOTENAME(@PathName + '\' + BIFileName + '.csv', '''') +
        N' WITH ( FIRSTROW=2, FIELDTERMINATOR=' + QUOTENAME(CHAR(126), '''') +
        N', ROWTERMINATOR=''\n'', CODEPAGE=''1201'', DATAFILETYPE=''WideChar'',
        MAXERRORS = 0, TABLOCK );'
            AS [ActionQuery],
        @TableName AS [Target],
        @InvolvedModule AS [Source]
    FROM (
        SELECT DISTINCT BIFileName,
               LTRIM(RTRIM(TableName)) AS TableName
        FROM [Insight].[v_ImportFileList]
        WHERE TotalRecords > 0
          AND 1 = CASE WHEN @TableName = N'ALL' THEN 1
                       WHEN TableName = REPLACE(@TableName, '_tmp', '') THEN 1
                       ELSE 0
                  END
    ) AS A
    ORDER BY BIFileName DESC
    FOR XML RAW('Task'), ROOT('Root'), ELEMENTS),
    @sql = CONVERT(NVARCHAR(MAX), @xmlQueries);

This action query is composed of multiple tasks, since the above action query has to be started, executed and
completed for every individual object (i.e. table) to be processed. Each individual task in the action query
will be carried out via a separate connection by a randomly assigned individual thread, which runs in the
background. In addition to this, two other input parameters control the maximum number of threads
that can run concurrently and whether to stop or continue the process if one of the executed tasks fails
while the others are still running.

The execution order of tasks by each thread is determined based on workload and resource availability and
is normally random, which is acceptable for the data processing required in the Import, Landing and
Source databases, as no dependencies or special ordering are required for creating their tables and loading
data into them.

The output of each task executed by a thread will be stored in a log in XML format, which can be parsed
and saved to the logging tables in the InsightETL database.
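
An illustrative call sketch follows (parameter names as documented in the Inputs section below; the database qualifier and the @xmlLog output parameter name are assumptions based on the logging description above):

-- Illustrative sketch: run the tasks built in @xmlQueries on automatically
-- sized worker threads, continuing past individual task failures.
-- The @xmlLog output parameter name is an assumption.
DECLARE @xmlLog XML;
EXEC Insight.dbo.s_ParallelExecActionQueriesTx
     @xmlQueries        = @xmlQueries,
     @ParallelismDegree = -1,   -- determine thread count automatically
     @ContinueWithError = 1,
     @xmlLog            = @xmlLog OUTPUT;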

Transactional Multi-threading
This kind of multi-threading configuration could potentially generate database deadlocks if a block of
interwoven T-SQL statements is multithreaded, instead of multithreading at the finest granular level (in
the latter scenario, data-access conflicts are much easier to handle).

To avoid this kind of issue, s_ParallelExecActionQueriesTx has the ability to do transactional multithreading
(hence the ‘Tx’ suffix, which stands for transaction), meaning that locks are applied on the objects
being manipulated only for the duration of an individual transaction and not while the entire stored
procedure or Analytics ETL job is being executed. This eases compliance with ACID standards (i.e.
Atomicity, Consistency, Isolation, and Durability), if required. The transaction duration can
be configured through a series of optional input parameters that will be discussed in the next section.

Inputs
 @xmlQueries - This input parameter stores the action query to be multi-threaded by
s_ParallelExecActionQueriesTx. All concurrent T-SQL tasks are described and submitted to the
procedure in this XML parameter (see the illustrative fragment after this list). Each task consists
of three elements:
1. ActionQuery
This element is identified by a keyword tagged as either <Action>…</Action> or
<Query>…</Query>. The task is represented by one or multiple T-SQL statements listed
between the tags. If there are multiple statements, they must be delimited by
semicolons.

Supported T-SQL statements include most action queries that do not return row
sets. For each task, the number of rows affected is based on the last T-SQL statement of
the task.

2. Target
This is the object targeted, e.g. the name of the table to be loaded into a database.
This optional element is identified by a pair of XML tags <Target>…</Target>. Logging
is more self-explanatory when the targeted database entity is clearly stated.

3. Source
This optional element is identified by a pair of XML tags <Source>…</Source>. The
name of the calling procedure can be listed here for each task.

 @ParallelismDegree - The maximum number of worker threads that can be used in parallel.
The default setting is -1, which means the degree is determined automatically.

 @ContinueWithError - When the cancellation feature is enabled (i.e. this parameter is set to
0, False), as soon as the first error is encountered in any thread, the other running tasks are
immediately aborted and the remaining pending tasks are canceled; on the other hand, if the
cancellation feature is disabled, all tasks are carried out independently to the
end, regardless of the outcome of the other tasks. The default setting is True (1).

 @UseTransactionalThread - Every single thread carrying out the tasks can be set as
transactional or not. The default setting is 0, i.e. False.

 @TransactScopeOption - The default value is 0. If threads are configured as transactional
(@UseTransactionalThread = 1), then there are three options to assign a controlling scope for
the transaction. Transaction Scope marks the boundaries of a code block as participating in a
transaction. Each option is described as follows:
o Suppress (@TransactScopeOption = 0) - The ambient transaction context is suppressed
when creating the scope. All operations within the scope are done without an ambient
transaction context. In other words, the Suppress option is equivalent to non-
transactional threading (i.e. @UseTransactionalThread = 0), even when the thread is nested
within an outer transaction.

o Required (@TransactScopeOption = 1) - A transaction is required by the scope. It uses
an ambient transaction if one already exists; otherwise, it creates a new transaction
before entering the scope.
o RequiresNew (@TransactScopeOption = 2) - A new transaction is always created for the
scope.

It is worth noting that this option applies to the inner transaction that a worker thread directly runs
within. Any ambient transaction coming from outside of the SQL-CLR stored procedure is always
suppressed.

 @IsolationLevel - When using transactional threads, the isolation level determines how
transaction integrity is visible to other worker threads. The default parameter value is 2, for
ReadCommitted. It can be set to one of the following options:
o ReadUncommitted (@IsolationLevel = 1);
o ReadCommitted (@IsolationLevel = 2);
o RepeatableRead (@IsolationLevel = 3);
o Serializable (@IsolationLevel = 4);
o Snapshot (@IsolationLevel = 5)

Note that ‘Chaos’ mode is not supported in SQL Server Database Engine, therefore
‘@IsolationLevel = 0’ is not a valid option in the SQL-CLR stored procedure.

 @TransactTimeout - Transaction timeout is measured in seconds. It defaults to 10800
seconds (i.e. 3 hours) in this procedure. Any single transactional task lasting longer than this
time threshold will be aborted and will consequently produce a TransactionAbortedException in the
multithreading log.
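
For reference, the XML submitted through @xmlQueries has the following shape. This is an illustrative fragment only; the file path and object names are hypothetical, and it mirrors the output of the FOR XML RAW('Task'), ROOT('Root'), ELEMENTS clause used in the queries shown in this section.

<Root>
  <Task>
    <ActionQuery>BULK INSERT [InsightImport].[dbo].[v_CUSTOMER] FROM 'D:\Extracts\CUSTOMER.csv' WITH ( FIRSTROW=2, FIELDTERMINATOR='~', ROWTERMINATOR='\n', CODEPAGE='1201', DATAFILETYPE='WideChar', MAXERRORS = 0, TABLOCK );</ActionQuery>
    <Target>CUSTOMER</Target>
    <Source>s_ImportTable_Update</Source>
  </Task>
  <!-- one Task element per object to be processed in parallel -->
</Root>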

Outputs
 @xmlLog - The details of the multithreading run are output from the SQL-CLR stored procedure
into an XML variable.
This XML output parameter can be parsed and shredded into a table by utilizing the helper
table-valued function dbo.fn_GetXmlLogFromParallelExecActionQueriesTx. The data logged
by this function, based on the content of the @xmlLog output parameter, is discussed in the
section dedicated to the logging tables.
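
As a minimal sketch (assuming @xmlLog has just been populated by a call to s_ParallelExecActionQueriesTx, as in the examples below, and that the function resides in InsightETL), the log can be shredded as follows:

DECLARE @xmlLog xml;
-- ... EXEC s_ParallelExecActionQueriesTx ..., @xmlLog = @xmlLog OUT; ...
SELECT *
FROM InsightETL.dbo.fn_GetXmlLogFromParallelExecActionQueriesTx(@xmlLog);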

Examples of Non-transactional vs Transactional multi-threading

 Example 1: Non-transactional multithreading, since BULK INSERT already applies its own
internal transaction control for each batch
USE InsightImport;
DECLARE @xmlTasks xml, @spReturn int, @xmlLog xml;
-- Prepare the tasks in XML, including the object targeted and the calling procedure name
SELECT @xmlTasks = (
    SELECT
        'BULK INSERT [InsightImport].[dbo].[v_' + TableName + '] FROM ' +
        QUOTENAME('C:\TEST\' + BIFileName + '.csv', '''') +
        ' WITH (FIRSTROW=2, FIELDTERMINATOR=''~'', ROWTERMINATOR=''\n'', CODEPAGE=''1201'', DATAFILETYPE=''WideChar'', MAXERRORS = 0, TABLOCK);'
            AS [ActionQuery],
        TableName AS [Target],
        's_ImportTable_Update' AS [Source]
    FROM (
        SELECT DISTINCT
            BIFileName, LTRIM(RTRIM(TableName)) AS TableName
        FROM [Insight].[v_ImportFileList]
        WHERE TotalRecords > 0) AS A
    ORDER BY BIFileName DESC
    FOR XML RAW('Task'), ROOT('Root'), ELEMENTS);

-- Run in parallel with non-transactional threads
EXEC @spReturn = [Insight].dbo.s_ParallelExecActionQueriesTx
    @xmlLog = @xmlLog OUT,
    @xmlQueries = @xmlTasks,
    @ParallelismDegree = -1,       -- Auto-detect the best number of worker threads
    @ContinueWithError = 0x1,      -- Failure in a thread does not cancel other threads
    @UseTransactionalThread = 0x0; -- Threads are not transactional

 Example 2: Transactional multithreading, allowing threads to read uncommitted data so that
database deadlocks are avoided in the concurrent process

USE InsightImport;
DECLARE @xmlTasks XML, @xmlLog XML, @sql NVARCHAR(MAX), @spReturn INT,
    @Info NVARCHAR(MAX), @EventLogID INT;

SELECT
    @xmlTasks = (
        SELECT
            CONCAT('EXEC InsightImport.Insight.s_Attributes_Update @EntityID = ', EntityID,
                   ', @BatchNum = 5678') AS [ActionQuery],
            Name AS [Target],
            's_Import_Control' AS [Source]
        FROM InsightImport.Insight.Entities
        ORDER BY EntityID
        FOR XML RAW('Task'), ROOT('Root'), ELEMENTS),
    @sql = CONVERT(NVARCHAR(MAX), @xmlTasks);

IF @xmlTasks IS NOT NULL BEGIN
    -- Run in parallel with transactions and ReadUncommitted isolation
    EXEC @spReturn = Insight.dbo.s_ParallelExecActionQueriesTx
        @xmlLog = @xmlLog OUT,
        @xmlQueries = @xmlTasks,
        @ParallelismDegree = 8,      /* Use up to 8 worker threads */
        @ContinueWithError = 1,      /* Cancellation is off */
        @UseTransactionalThread = 1, /* Individual threads are transactional, otherwise deadlocks can occur */
        @TransactScopeOption = 2,    /* 0: Suppress; 1: Required; 2: RequiresNew */
        @IsolationLevel = 1;         /* 1: ReadUncommitted; 2: ReadCommitted; 3: RepeatableRead; 4: Serializable; 5: Snapshot */

    -- Logging
    SET @Info = '[Multithreading]: ' +
        IIF(@spReturn = 0,
            'Successfully updated Attributes ',
            CONCAT(@spReturn, ' error(s) encountered while updating Attributes '));

    EXEC @EventLogID = dbo.s_EventLog_Add
        @Batch = 5678,
        @Severity = 'Notification',
        @SourceProc = 's_Import_Control',
        @ObjectTargeted = 'All tables in Entities',
        @ElapsedTime = NULL,
        @Info = @Info,
        @InfoDetails = NULL,
        @SqlStatement = @sql,
        @MultithreadingLog = @xmlLog; /* XML log shredded into table */
END

dbo.s_ExecSingleActionQuery
Description
This stored procedure is part of the multi-threading, or paralleling, process. It is used for creating primary
keys in parallel when InsightSource.dbo.s_InsightSource_Update is run. It is particularly useful because it
is able to return any error message thrown by the SQL Server Database Engine, and it is consequently also used in
the batch control process to update the Batch table in InsightETL.

s_ExecSingleActionQuery can run a T-SQL action query in either async or sync mode within CLR and returns
all error messages in case a thread fails, so this stored procedure can also be used for logging purposes.

In async mode, the action query is run by an auxiliary worker/background thread and the main thread
does not wait for the auxiliary worker to complete. Instead, the main/calling
thread continues to the next task (for example, creating other PKs for other tables) as soon as the auxiliary
worker thread is started. The worker thread runs asynchronously and reports exceptions to the main
thread once it returns.


In sync mode, the main thread waits for the auxiliary thread to complete its task before moving on
to the next one. s_ExecSingleActionQuery reports exceptions in the same way as in async mode.

Note that, although this procedure is able to return all error messages, the built-in TRY-CATCH in T-SQL
only captures the very last error in its ERROR_MESSAGE() function and misses the first error message.

dbo.s_Parse_XML
Description
This stored procedure is used to load XML data into tables, normally during the post-deployment process.

dbo.s_MultiStringSplitToRows
Description
This CLR stored procedure splits multi-value columns and distributes them into sequential rows. Columns
are typed as specified in the parameter @MultiValueColumnSchemaList. This procedure is called during the
Import process by s_T24AllMultiValue_Add, which in turn is executed within s_Import_Control in
InsightImport.

Inputs
 @InputString – string to be inputted, varchar(max)
 @Delimiter – delimiter to be used, varchar(1)
 @MultiValueColumnSchemaList – list of multi-value columns, varchar(max)
 @TypeofScriptOutput – type of output script, int
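
A hypothetical call is sketched below. The delimiter, the schema-list format and the output-type code are illustrative assumptions only, as the exact formats expected by @MultiValueColumnSchemaList and @TypeofScriptOutput are installation-specific.

-- Hypothetical call; values are illustrative only
EXEC dbo.s_MultiStringSplitToRows
    @InputString = N'1000*2000*3000',                            -- multi-value string to split into rows
    @Delimiter = '*',                                            -- single-character delimiter
    @MultiValueColumnSchemaList = N'LimitAmount decimal(18,2)',  -- assumed schema-list format
    @TypeofScriptOutput = 1;                                     -- assumed output-type code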

dbo.s_StringSplitToColumns
Description
This CLR stored procedure splits a local ref column into multiple columns. Columns are either string-typed
or use other data types as specified by the parameter @LocalRefSchemaList. This procedure is called during
the Import process by s_T24AllMultiValue_Add, which in turn is executed within s_Import_Control in
InsightImport.

Inputs
 @InputString – string to be inputted, varchar(max)
 @Delimiter – delimiter to be used, varchar(1)
 @LocalRefSchemaList – list of Local ref schemas, varchar(max)
 @LocalRefColumn – list of Local ref columns, varchar(128)


dbo.s_CreateColumnCalculations
Description
This stored procedure carries out the split, calculation and dataset operations described in the
AttributeCalculations table. It can be run manually for testing purposes, or it can be included in the Analytics
ETL SQL Server Agent job or script in the appropriate position. For example, when Splits and Calculations
entries on imported data are used to make later Extract logic in v_source views simpler and faster, the call
would be placed after all the InsightImport steps.

Inputs
 @DatabaseName – name of the database for which the stored procedure will be executed
 @TableName – name of the table for which the stored procedure will be executed
 @SchemaName – name of the schema of the table for which the stored procedure will be
executed
 @ExecutionPhase – core Analytics ETL phase in which the operation is taking place (only
applicable to Dataset operations)
 @ExecutionStep – core Analytics ETL step in which the operation is taking place (only applicable
to Dataset operations)
 @BatchNum – Batch number used to execute the stored procedure.
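
A sketch of the two agent-job invocations described later in this chapter is shown below. The batch number and table are illustrative, and whether NULL is accepted for the Dataset-only parameters is an assumption.

-- Insight Attribute Calculations-Import step (illustrative values)
EXEC InsightETL.dbo.s_CreateColumnCalculations
    @DatabaseName = 'InsightImport',
    @TableName = 'LIMIT',
    @SchemaName = 'dbo',
    @ExecutionPhase = NULL,  -- only applicable to Dataset operations
    @ExecutionStep = NULL,   -- only applicable to Dataset operations
    @BatchNum = 5678;

-- Insight Attribute Calculations-Source step: identical except for the database name
EXEC InsightETL.dbo.s_CreateColumnCalculations
    @DatabaseName = 'InsightSource',
    @TableName = 'LIMIT',
    @SchemaName = 'dbo',
    @ExecutionPhase = NULL,
    @ExecutionStep = NULL,
    @BatchNum = 5678;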

dbo.s_CreateIndexes
Description
This stored procedure creates additional indexes, as defined in the Indexes configuration table. It
can be run manually for testing purposes, or it can be included in the Analytics ETL SQL Server Agent job or
script in the appropriate position. For example, if additional indexes are required on foreign keys in the
InsightSource database, the call would be placed after the InsightSource update and before the InsightStaging
update.

Inputs
 @TableName – name of the table for which the stored procedure will be executed
 @DatabaseName – name of the database for which the stored procedure will be executed
 @TenantId – Id of the database tenant
 @BatchNum – Batch number used to execute the stored procedure.
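
As a sketch (all parameter values below are illustrative; @TenantId in particular is installation-specific):

-- Hypothetical manual run; values are illustrative only
EXEC InsightETL.dbo.s_CreateIndexes
    @TableName = 'ACCOUNT',
    @DatabaseName = 'InsightSource',
    @TenantId = 1,
    @BatchNum = 5678;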

dbo.s_PopulateAuditCounts
Description
This procedure populates the dbo.TableRowCountAudit log with the record counts for the databases
involved in ETL (e.g. InsightImport, InsightLanding etc.). This stored procedure is executed several times
during various stages of ETL.

Inputs
 @DatabaseName – name of the database for which the stored procedure will be executed e.g.
InsightImport


 @ExtractListSourceName – name of the source system for which the stored procedure will be
executed, e.g. BS
 @SchemaName – name of the schema of the table for which the stored procedure will be
executed, e.g. dbo
 @TableType – type of table for which the stored procedure is executed, e.g. 'Hash Table', 'DMV',
'Select' etc.
 @ETLPhase – name of the ETL phase for which the stored procedure will be executed
 @TableName – name of the table for which the stored procedure will be executed; the 'All' keyword
is acceptable
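
A sketch of a typical call (values illustrative; the phase name passed to @ETLPhase is an assumption):

-- Hypothetical call; values are illustrative only
EXEC InsightETL.dbo.s_PopulateAuditCounts
    @DatabaseName = 'InsightImport',
    @ExtractListSourceName = 'BS',
    @SchemaName = 'dbo',
    @TableType = 'Select',
    @ETLPhase = 'Import',  -- assumed phase name
    @TableName = 'All';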

dbo.s_EventLog_Add
Description
This stored procedure adds new rows to the EventLog table.

dbo.s_EventLogDetails_Add
Description
This stored procedure adds new rows to the EventLogDetails table.

dbo.s_MergeUpdateBatchStatus
Description
This stored procedure creates a new batch in the Batch table with a new batch number or it continues to
use the current active batch and performs in-place updates to the involved Batch record.

dbo.s_SetBatchStatusFinished
Description
This stored procedure marks the completion of any batch executed, either ‘CompletedWithError’ if there
are error(s) or ‘CompletedSuccessfully’ when no error is encountered.

dbo.s_StagingEventLog_Add
Description
This stored procedure adds new rows to the StagingEventLog table.

dbo.s_StagingEventLog_Update
Description
This stored procedure performs in-place updates to the rows in the StagingEventLog table.

dbo.s_StagingEventLogDetails_Add
Description
This stored procedure adds new rows to the StagingEventLogDetails table.

dbo.s_InsightETL_Purge
Description
This stored procedure purges the content of the Log tables within the InsightETL database i.e. EventLog,
EventLogDetails, StagingEventLog and StagingEventLogDetails.


dbo.s_InsightETL_RangePurge
Description
This stored procedure internally calls the s_InsightETL_Purge stored procedure to purge Log tables within
a certain date range.

dbo.s_ColumnStoreIndex_Defragmentation
Description
This stored procedure is used for index defragmentation in the InsightLanding database. Although this
procedure is stored in InsightETL, it operates on InsightLanding, and it is described in detail in a dedicated
section of the InsightLanding chapter of this document.

dbo.s_LoadCDPRules
Description
This stored procedure loads General Data Protection Regulation / Customer Data Protection metadata from
Temenos Core Banking into the Analytics rules engine so that update statements can be generated to erase
certain customer attributes. More details about the usage of this and other CDP-related procedures are
provided in the chapter of this document dedicated to General Data Protection Regulation.

dbo.s_LoadCDPAnalyticsRules
Description
This procedure loads metadata from the InsightWarehouse data dictionary into the Analytics Rules Engine
for the purposes of the Right to Erasure under CDP/GDPR. This procedure is for Analytics data erasure in
InsightWarehouse, as opposed to raw Temenos Core Banking data in InsightLanding. The following tables
are loaded:

CDPPurpose

RuleDefinitions

RuleColumns

RuleCustomers (it is assumed that the customers will already have been loaded from Temenos Core
Banking sources)

dbo.s_CreateCDPUpdateLogic
Description
This procedure produces an update statement given a list of columns.

dbo.s_ExecuteCDPRules
Description
This procedure executes the CDP-related rules and logs the results.

dbo.s_CreateCDPUpdateStatements
Description
This procedure creates update statements for CDP-related rules and logs the results.


Online.s_OnlineUpdateBatchStatus
Description
This stored procedure creates a new batch in the Online.OnlineBatch table with a new Online batch number
or it continues to use the current active batch and performs in-place updates to the involved Online Batch
record.

dbo.fn_GetFromJson_RuleDefinitions
Description
This table-valued CLR function accepts a JSON file and returns a dataset to load into RuleDefinitions.

Example Call
declare @json nvarchar(max) =
N'[--JSON in here--
]';

Select *
From InsightETL.dbo.fn_GetFromJson_RuleDefinitions(@json);

dbo.fn_GetFromJson_RuleColumns
Description
This table-valued CLR function accepts a JSON file and returns a dataset to load into RuleColumns.

dbo.fn_GetFromJson_RuleExpressionLevels
Description
This table-valued CLR function accepts a JSON file and returns a dataset to load into RuleExpressionLevels.

dbo.fn_GetFromJson_RuleValues
Description
This table-valued CLR function accepts a JSON file and returns a dataset to load into RuleValues.

dbo.fn_GetXmlLogFromParallelExecActionQueriesTx
Description
This Insight function is used to populate the EventLog and EventLogDetails tables in InsightETL with the
output logs created by the threads that execute parallelized action queries during Analytics ETL. In other
words, each stored procedure relying on Analytics enhanced multi-threading logs its activity in InsightETL
through this function.

Specifically, dbo.fn_GetXmlLogFromParallelExecActionQueriesTx parses the content of the @xmlLog
output parameter of s_ParallelExecActionQueriesTx. This parameter contains the multithreading log in XML
format. Once the XML string is shredded, the function also saves the results to the InsightETL
logging tables.


dbo.fn_GetCurrentBatch
Description
This scalar-valued multi-threading-related function returns the currently activated batch, which is used to assign tasks for execution.

dbo.fn_GetJsonErrMsg
Description
This table-valued function can be used with the APPLY operator to format the JSON error messages in the
ETL_DQ_ErrorMessage column and output them as separate fields.
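
A sketch of typical usage is shown below; the source table name is a hypothetical placeholder, and only the ETL_DQ_ErrorMessage column name is taken from the description above.

-- Hypothetical usage; the source table name is illustrative only
SELECT t.*, e.*
FROM dbo.SomeTableWithDQErrors AS t
CROSS APPLY InsightETL.dbo.fn_GetJsonErrMsg(t.ETL_DQ_ErrorMessage) AS e;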

dbo.fn_GetMultithreadingLog
Description
This table-valued function is used to display or archive threading logs.


Online.fn_OnlineGetCurrentBatch
Description
This scalar-valued function returns the Online Batch Id of the active Online micro batch.

s_IMD_AllViews (deprecated)
In pre-R18 releases, this InsightMasterData stored procedure created views for all Type 1 and Type 0
master data records by joining the CustomColumn table to the CustomValues table on the appropriate
CustomColumnID.

s_IMD_Views (deprecated)
In pre-R18 releases, this InsightMasterData stored procedure created a view for a particular
CustomColumnID.

Input
CustomColumnID – The id of the CustomColumn table for which a view needs to be created

InsightStaging..s_Transform_MD (deprecated)
In pre-R18 releases, this stored procedure (in the InsightStaging database) updated staging tables and fact
tables based on the MasterData stored in the CustomColumn and CustomValue tables.

Inputs
 TableName – Name of the InsightStaging table to be updated (e.g. StagingAccount)
 ETL Phase – ETL phase in which the update should take place (i.e. Extract or Transform)
e.g.: Exec s_Transform_MD 'StagingAccount', 'Extract'


s_InsightEditor_TableLoad (deprecated)
In pre-R18 releases, this stored procedure was used by the DataManager application to load tables.

Input:
 Schema – Schema of the table to be loaded into Data Manager
 Table – Name of the table to be loaded into Data Manager
 Where – Any filters (i.e. where clause) to be applied to the data loaded
 OrderBy – Sorting order (i.e. order by clause) to be applied to the data loaded

s_Translation_Check (deprecated)
In pre-R18 releases, this stored procedure checked the quality of Translations data.

Output
 List of Item Names that have different Translation across same or different object types
 List of Column Names in the translation table that are not present in the data dictionary
 List of Column Names in the Data Dictionary table that do not have a corresponding translation
 List of Cube attribute or measures that do not belong to any display folder - deprecated

s_ParallelExecActionQueries (deprecated)
Description
This is the previous version of s_ParallelExecActionQueriesTx, mainly used in R16 but still available for
backward compatibility.

It has the ability to capture errors that happened in other threads, to cancel tasks scheduled in other threads,
and to control the maximum number of worker/background threads (via the parameter
@MaxLogicalCpuNumber or, in the newer version, @ParallelismDegree). However, it has no ability to perform
transactional multithreading.

s_ExecuteInParallel (deprecated)
Description
This is a very early version of multithreading CLR procedure which became obsolete after R15.

s_ParseMultiStringToColumns (deprecated)
Description
This stored procedure was used for parsing tables with multi-values and sub-values in pre-R15 releases
and is still available for backward compatibility. It is now replaced by the s_MultiStringSplitToRows stored
procedure in Insight MasterData.

Page 154 | 335


Advanced Analytics Platform Technical Guide

s_ParseStringToColumns (deprecated)
Description
This stored procedure was used for parsing tables with local references in pre-R15 releases and is still
available for backward compatibility. It is now replaced by the s_StringSplitToRows stored procedure in
Insight MasterData.

s_Translate_Report (Deprecated)
Description
This is used by the translator module to add the translated report labels to the Reporting Services reports.

Inputs:
 Smexml xml
 ssasName nvarchar(100)

s_Translate_Cube (Deprecated)
This is used by the translator module to translate cube labels.

Inputs:
 Smexml xml
 ssasName nvarchar(100)
 ssasDBName nvarchar(100)

Triggers
Online.Tr_OnlineBatch_AfterIU_DeactiveOldBatches
Description
This trigger on the Online.OnlineBatch table updates all other old Online batch records to be inactive,
except the last one inserted/updated.

Configuration
This section describes how the various functions in InsightETL should be configured.

Analytics ETL System Dates

In order for the Analytics ETL process to run, the InsightETL CurrentDate table should contain the
snapshot date of the data being processed.

InsightETL SourceDate needs to contain a record for each of the dates that Analytics ETL will be run for,
for each source system.

This table maps the date of the source system data (Source Date) to a Business Date. These dates can
be different; this is a common occurrence when two source systems are loaded, one with a daily frequency
and one with a monthly frequency. The latest available date for the monthly source system would be used
for every business date until the next month's data is available.


BusinessDate  SourceSystem  SourceDate  Note
2014-01-01    BS            2014-01-01
2014-01-02    BS            2014-01-02
2014-01-03    BS            2014-01-03
2014-01-01    Budget        2013-12-31  The budget would continue to use 2013-12-31 until the next month's data is available. Data for 2013-12-31 stored in the landing table Budget.GLBudget would be used for the 2014-01-01 business date.
2014-01-02    Budget        2013-12-31
2014-01-03    Budget        2013-12-31

Figure 23 - InsightETL.dbo.SourceDate Example

The system allows you to override the default source dates and control how the dates are created using
the v_System_Parameters InsightETL view. dbo.s_InsightSource_CSI_Update populates this table with
the appropriate date, or you can update it directly and set the 'IsDateOverride' flag for full manual control
of the source date.
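
A minimal sketch of such a manual override is shown below, assuming the SourceDate columns shown in the example above plus the IsDateOverride flag; the dates and source system are illustrative only.

-- Hypothetical manual override; values are illustrative only
UPDATE InsightETL.dbo.SourceDate
SET SourceDate = '2014-01-31',
    IsDateOverride = 1   -- take full manual control of this source date
WHERE BusinessDate = '2014-02-01'
  AND SourceSystem = 'Budget';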

Rules Engine
Users can define new rules and modify existing ones using the Data Manager option within the Analytics
application – please refer to the Analytics Front End User Guide for more information on how this is done.
An out-of-the-box set of business rules applied to the InsightStaging database is provided by Temenos.
Users can, however, create business rules that, during Analytics ETL, are also applied to tables or views
residing in the InsightSource and InsightImport databases.
If the financial institution requests it, Temenos can also enable functionality that allows business rules
to be designed against the InsightLanding database's tables or views and the InsightWarehouse's abstraction views.

Column Splits
The setup of business rules used to split compound columns is currently not possible through the Rules
Engine. Split rules can only be designed using the AttributeCalculations table. Let us look at an example to
understand how this kind of configuration works. In the figure below we can see the relevant columns of
a split definition in the AttributeCalculations table. The locally developed split below is used to break the
LIMIT @ID into three parts; this compound primary key consists of the values of CUSTOMER @ID,
LIMIT.REFERENCE @ID and a sequence number, separated by a dot ('.'). As defined below, the split
will be applied to the LIMIT table of the InsightSource database.


Figure 24 - Example of Split definition in AttributeCalculations

Like the business rules defined through the Rules Engine, split rules managed in the
AttributeCalculations table are also executed within the Analytics ETL flow. Specifically, split rules are applied
by the s_CreateColumnCalculations stored procedure which, in the out-of-the-box Analytics ETL agent job
provided by Temenos, is run twice: first on the InsightImport database and then on the InsightSource
database, within the Insight Attribute Calculations-Import and Insight Attribute Calculations-Source steps,
respectively. The commands within each step are identical with the exception of the first input parameter's
value, i.e. the database name.

In our example, the split definition indicates that the rule should be applied in the InsightSource database,
so the Limit Id column will be split in InsightSource when the Attribute Calculations-Source step of Analytics
ETL is executed.

Figure 25 - Analytics ETL steps executing column splits and their commands

Once the relevant step of the Analytics ETL is executed, the structure of the target table will be modified
to contain the new columns resulting from the split.


Figure 26 - Example of three more columns added to InsightSource.dbo.LIMIT as a result of the split rule

Maintaining InsightETL
Probably the most common maintenance task is to ensure that the additional mapping defined in InsightETL
is up to date. InsightETL contains additional Rule Definitions that are mapped based on the
content of RuleValues or RuleExpressionLevels.

As new source data comes into Analytics, new rule values can be created using the Data Manager facilities
in the Analytics web application. Depending on the type of business rule defined in Analytics, these
new values may create the need to define additional mapping in Analytics. For a full understanding of
the different business rule types/operations, please refer to the section on InsightETL earlier in this
document.

Lookup Rules
The most common business rule type that needs to be maintained is Lookup. This is the type of rule
used to build custom classifications or hierarchies. One or more source columns are defined as a key for
the target column, and every time a unique source column value is brought in, a new row is created in the view
for that target.

As an example, if we are defining a product or account hierarchy based on a unique product code coming
from the source system, then we would use the product code as the source value for the target column and
one or more custom values. In this example, we will define the view for the target column as
ProductClassification. When a new product code is brought in by the ETL process, a new record will
be created in the ProductClassification view for the new source value, with N/A as all target column values.

A user will have to replace all N/A values with appropriate mapping values, in this case the appropriate
product classifications. Follow the example below to see a typical lookup target column update, where we
are defining the Category and Classification custom columns for the account object.

RuleDefinitions entry (not all columns included):

RuleDefinitionId  Table           ItemName                       Operation  Target View Name
19                StagingAccount  Product-ProductClassification  Lookup     ProductClassification

V_ProductClassification before new record (partial):

StagingProductId  Foreign Key  Product Code  ProductDesc                     Category  Classification
1                 19           1001          Current Account                 Deposit   Chequing
2                 19           1002          Current Account with Overdraft  Deposit   Chequing

V_ProductClassification after new record (partial):

StagingProductId  Foreign Key  Product Code  ProductDesc                     Category  Classification
1                 19           1001          Current Account                 Deposit   Chequing
2                 19           1002          Current Account with Overdraft  Deposit   Chequing
3                 19           1010          Credit Cards                    Deposit   N/A

When ETL is run, the new values are brought into the warehouse as N/A. At this point, we need to replace
the N/A values with appropriate mapping so the new product can be mapped to the appropriate account
category and classification. This is done using the Data Mappings tab of the Edit Rule screen of the Analytics
web front end.


Figure 27 - Updating Lookup mapping using the Data Mappings tab in the Edit Rule screen

For Lookup rules, the screen above will update the RuleValue table and the associated InsightETL view,
e.g. v_ProductClassification in our case, as shown in the following table.

V_ProductClassification after update by user:

ClassificationId  Foreign Key  Product Code  ProductDesc                     Category  Classification
1                 19           1001          Current Account                 Deposit   Chequing
2                 19           1002          Current Account with Overdraft  Deposit   Chequing
3                 19           1010          Credit Cards                    Deposit   Line of Credit

Once the above update has been made and ETL has run again, the N/A values for Category and
Classification for product code 1010 will be replaced with the updated mapping in v_Classification.


This maintenance has to be done for every Lookup business rule in InsightETL as new data is entered.
Note that two ETL runs have to complete before the new source column value has been classified and
the new custom column value has been added to the warehouse.

Depending on the Master Data configuration view, for example v_COA (Chart of
Accounts) and v_InsightHierarchy (internal hierarchy of the COA), it is recommended that as soon as you
update or add a GL Account / Line in Core Banking or the GL source system, you also add/update the same
record in these two views via Data Manager before the end of the day. This way, when the ETL runs, the
proper hierarchy and attributes are found and mapped correctly for the new records in InsightWarehouse.
Otherwise, the new GL Accounts will have their attributes set to the default value of 'N/A'; at that point you
will need to change these values in Data Manager and then reprocess the ETL so that the proper mappings
for the new GL Accounts are reflected and the existing reports show accurate data. The Data QA report will
also show the number of 'N/A' values that the system currently has so that they can be addressed accordingly.

Banding Rules
While less common, Banding business rules may require an update if the case statement (banding logic)
did not cover all possible source column values. For example, if there is no record to account for a
source value of NULL, then the target column value will be brought in as NULL. The target column will have
a NULL value any time the source column value is not covered by a statement.

RuleDefinitions entry (not all columns included):

RuleDefinitionId  Table           ItemName           Operation  Target View Name
5                 StagingAccount  Customer-AgeGroup  Banding    AgeGroup

Partial example of RuleExpressionLevels table (for AgeGroup-related entries):

RuleExpressionLevelsId  RuleDefinitionId  BandOrder  SQLExpression      BandName
11                      5                 11         between 19 and 24  19-24
12                      5                 12         between 25 and 34  25-34
13                      5                 13         between 35 and 44  35-44
14                      5                 14         between 45 and 54  45-54
15                      5                 15         between 55 and 64  55-59
17                      5                 16         >64                64+
18                      5                 17         between 0 and 18   <19

In this example, we are not accounting for cases where the source column Age has a NULL value. Any time
we have a NULL Age, the custom column Age Group will also be NULL. To add a new row for the NULL
age, we should again use the Data Mappings tab of the Edit Rule screen in the Analytics web front end.
We can click on the Add a row button as shown below to include a blank row in the list of buckets associated
with our banding rule, then complete the required fields. Please note that, unlike for Lookups, this screen
will not affect the RuleValues table. Instead, when we edit or add a new group in a banding rule, the
RuleExpressionLevels table is updated.


Partial example of RuleExpressionLevels table (for AgeGroup-related entries) with NULL mapping:

RuleExpressionLevelsId  RuleDefinitionId  BandOrder  SQLExpression      BandName
11                      3                 1          between 19 and 24  19-24
12                      3                 2          between 25 and 34  25-34
13                      3                 3          between 35 and 44  35-44
14                      3                 4          between 45 and 54  45-54
15                      3                 5          between 55 and 64  55-59
17                      3                 6          >64                64+
18                      3                 9          between 0 and 18   <19
19                      3                 10         IS NULL            No Age

Now that we have added the new record to RuleExpressionLevels, any time a NULL value is found for Age,
the value No Age will be used for the Age Group custom column. Since all conditions are now met
by our banding statement, we should no longer have any NULL values for the Age Group target column in
the warehouse.

Using Data Manager

If a user needs to edit master data mapping, it is recommended not to access the InsightETL
database directly but to use the Data Manager option in the System menu of the Analytics web application.
Through this option we can edit the required tables and views in InsightETL using a simple and intuitive web
interface.


Additionally, this interface can be configured so that end users can only edit the target columns for Banding
rules that they are intended to edit, which allows detailed control of what master data can and should be
modified by end users.

For a full explanation of using the Data Manager option please consult the Analytics Web Front End User
Guide.

Important Note: Please note that changes to maintenance stored procedures are
not supported.


InsightStaging
Overview
InsightStaging extracts data from InsightSource database tables, transforms it and then loads it into the
InsightWarehouse database. Most of the InsightStaging tables are temporary and recreated for each
Analytics ETL execution.

Specific Features / Functions


Feature Description
Field mapping from Source to Target – Source data mapping into the Fact and Dimension tables is done by means of InsightStaging v_Source views and the data dictionary in InsightWarehouse. The views also allow the addition of business logic to the source data.
Extraction, Transformation and Loading of the data into the InsightWarehouse – Fields are renamed and mapped, and business rules are applied. For dimension inserts, business rules defined in the Rules Engine are determined. Primary and foreign keys are set. Data is inserted into the InsightWarehouse.
Interfaces with Analytics optional modules – InsightStaging can exchange ETL data with Analytics optional modules, e.g. Customer Profitability, Predictive Analytics etc. For example, if Customer Profitability is installed, this module will calculate Customer Monthly Net Income and other parameters, which will be added as columns to the temporary tables in InsightStaging and finally loaded into the Warehouse.
Table-driven orchestration of ETL procedures – The flow of the Analytics ETL is controlled based on the content of the UpdateOrder table. Analytics ETL components can be easily added, changed or removed.
Configuration of new source systems – Source systems can be added and enabled or disabled.


Figure 28 - InsightStaging in the core Analytics ETL flow


Technical Details
Architecture
The basic architecture of the InsightStaging data flow is depicted in the following diagram, which shows the
high-level flow for the Extraction, Transformation, and Loading of a Dim and Fact combination.

Figure 29 - Analytics ETL data flow in InsightStaging (high-level overview)

Technical Components
InsightStaging consists of the following components.

Tables
dbo.Systems
This table controls which source systems are included in the Analytics ETL process. Using a different source
system allows for a different Analytics ETL configuration per source system, and allows data extraction for
a system to be turned off for a particular Analytics ETL run if required.

ColumnName Description
SourceSystem The name of the SourceSystem, which must match the source name schema in the extract list, e.g. BS, Budget etc.
SystemType Defines which type of source system we are dealing with. E.g. the source system Multi Funds Dealers Association can have its system type as WM (i.e. Wealth Management). Null is an acceptable value.
SystemModules Deprecated
SystemRank The order in which the system is processed during Analytics ETL.
Enabled_ Defines if data from the source system will be processed during Analytics ETL, along with all the ETL processes that might be associated with it. Acceptable values are 1 (meaning Yes) or 0 (meaning No).
Configuration Defines what the source system configuration is – available values are:
- ModelBank: this entry has been added to satisfy Temenos core banking mapping and/or business rules
- Local: the entry is used during the implementation to update or enhance Framework or ModelBank functionality
- Framework: this entry has been added by the TFS Framework solution and it is core banking agnostic
- PBModelBank: used for Private Banking record definitions
- CampaignAnalytics: used for Campaign Analytics solutions
- Predictive: used for the Predictive Analytics module when deployed
As all parameterization in this table is source system agnostic, the default value will be Framework for all pre-configured source system entries.

dbo.UpdateOrder - Standard
This table controls the execution order of Analytics ETL processes executed during the InsightStaging
update process.

The table has two columns that allow processes to be enabled or disabled: Enabled_ and Exclude. The
value of the Enabled_ column defines whether the source system to which a certain process belongs is
enabled or disabled in the current Analytics installation. The value of this column is inherited from the value
assigned to the Enabled_ column for its source system entry in the Systems table. Processes will only run
if their associated source system is enabled in this table.

If Enabled_ is set to 1, the Exclude column allows excluding any individual step or group of steps from
today's Analytics ETL process, while permitting them to be re-enabled at another time.


For example, the Insight Pricing update can be turned off, or the FactGL could be initially disabled and then
enabled when source data becomes available.

Important Note: the value of the column Enabled_ in Update Order should not be
updated directly, but only by changing the value of the Enabled_ column for the
associated source system entry in the Systems table. The content of the column
Enabled_ in Update Order is regenerated based on the content of the System table so
direct changes will not be preserved after an Analytics ETL run.

Column Name Value
UpdateOrder This is a system-generated unique identifier for each process defined in UpdateOrder and defines the order in which processes will be executed.
The UpdateOrder code is created based on the combination of a series of numeric codes identifying the ETL phase, the table to be processed etc. It is built as follows:
- The first digit is defined based on the ETL phase number (stored in the ETLPhaseNum column) and cannot be updated.
- The second, third and fourth digits depend on the table to be loaded (stored in the TableSeqNum column) and cannot be updated either.
- The middle three digits can be used to differentiate between different source views (stored in the SourceViewNum column).
- The last two digits identify the substep defined (identified by the code stored in the SubStepNum column).
Configuration Defines what the process configuration is – available values are:
- ModelBank: this entry has been added to satisfy Temenos core banking mapping and/or business rules
- Local: the entry is used during the implementation to update or enhance Framework or ModelBank functionality
- Framework: this entry has been added by the TFS Framework solution and it is core banking agnostic
- PBModelBank: used for Private Banking record definitions
- CampaignAnalytics: used for Campaign Analytics solutions
- Predictive: used for the Predictive Analytics solution when deployed
TableName The name of the table being populated, e.g. DimAccount.
SourceSystem The name of the source system from which source data is extracted, e.g. 'BS' or 'Budget'. If the data being processed is produced during Analytics ETL and is source system agnostic (or relies on multiple source systems), the 'Framework' value will be used in this column. Null values are also acceptable.
TableType Set to the suffix of the v_source view that is being linked to by this record, e.g. BS or BSDMD. So, for example, if DimAccount was being populated and the TableType was BSDMD, the view being called by this record would be v_SourceAccountBSDMD.
Action_ Action performed by the process defined. Acceptable values are:
- Extract
- Extract.substep
- Merge.substep
- Add.substep
- Transform
- Transform.Substep
- Load
Enabled_ Defines if the source system to which a process belongs is enabled. Acceptable values are 1 (for Yes) or 0 (for No) and they are dependent on the Systems table.
Exclude Configure this column to exclude the current process from Analytics ETL. If set to 0, the process is included and will be executed in the next ETL run. If set to 1, the process is excluded and will be ignored in the next ETL run.
ETLPhaseNum One-digit code for the ETL phase of the currently defined process. Acceptable values are 1 to 7. It defines the first digit of the UpdateOrder id column.
TableSeqNum Three-digit numeric code identifying the table being processed by the currently defined process. It defines the second, third and fourth digits of the UpdateOrder id column.
SourceViewNum Three-digit numeric code identifying the v_source view being used by the currently defined process. It defines the fifth, sixth and seventh digits of the UpdateOrder id column.
SubStepNum Two-digit numeric code identifying the substep in the currently defined process. It defines the last two digits (eighth and ninth digits) of the UpdateOrder id column.
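
For example (an illustrative composition only, not a shipped configuration row): a process with ETLPhaseNum = 3, TableSeqNum = 010, SourceViewNum = 002 and SubStepNum = 01 would receive the UpdateOrder value 301000201.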

dbo.UpdateOrder - Execute
There are some rare circumstances that require custom executables to be run within the ETL. This
functionality has been used to call the Canadian HouseHolding (address parsing) module and for Cost
Allocation processing.

Column Name Value
UpdateOrder This is a system-generated unique identifier for each process defined in UpdateOrder and defines the order in which processes will be executed.
The UpdateOrder code is created based on the combination of a series of numeric codes identifying the ETL phase, the table to be processed etc. It is built as follows:
- The first digit is defined based on the ETL phase number (stored in the ETLPhaseNum column) and cannot be updated.
- The second, third and fourth digits depend on the table to be loaded (stored in the TableSeqNum column) and cannot be updated either.
- The middle three digits can be used to differentiate between different source views (stored in the SourceViewNum column).
- The last two digits identify the substep defined (identified by the code stored in the SubStepNum column).
Configuration Defines what the process configuration is – available values are:
- ModelBank: this entry has been added to satisfy Temenos core banking mapping and/or business rules
- Local: the entry is used during the implementation to update or enhance Framework or ModelBank functionality
- Framework: this entry has been added by the TFS Framework solution and it is core banking agnostic
- PBModelBank: used for Private Banking record definitions
- CampaignAnalytics: used for Campaign Analytics solutions
- Predictive: used for the Predictive Analytics solution when deployed
TableName The name of the table being populated, e.g. DimAccount.
SourceSystem The name of the source system from which source data is extracted, e.g. 'BS' or 'Budget'. If the data being processed is produced during Analytics ETL and is source system agnostic (or relies on multiple source systems), the 'Framework' value will be used in this column. Null values are also acceptable.
TableType Enter the stored procedure name and any parameters, e.g. insighthouseholding.dbo.s_insighthouseholding_update 'Insight', '0'
Action_ Extract.Execute or Transform.Execute or Load.Execute
Enabled_ Defines if the source system to which a process belongs is enabled. Acceptable values are 1 (for Yes) or 0 (for No) and they are dependent on the Systems table.
Exclude Configure this column to exclude the current process from Analytics ETL. If set to 0, the process is included and will be executed in the next ETL run. If set to 1, the process is excluded and will be ignored in the next ETL run.
ETLPhaseNum One-digit code for the ETL phase of the currently defined process. Acceptable values are 1 to 7. It defines the first digit of the UpdateOrder id column.
TableSeqNum Three-digit numeric code identifying the table being processed by the currently defined process. It defines the second, third and fourth digits of the UpdateOrder id column.
SourceViewNum Three-digit numeric code identifying the v_source view being used by the currently defined process. It defines the fifth, sixth and seventh digits of the UpdateOrder id column.
SubStepNum Two-digit numeric code identifying the substep in the currently defined process. It defines the last two digits (eighth and ninth digits) of the UpdateOrder id column.

dbo.SourceDate
This table shows the current business date that is being processed. This date is used as the snapshot date
for the fact tables.
Column Name Description
SourceDateId Record Id (identity column). Populated automatically.

Page 170 | 335


Advanced Analytics Platform Technical Guide

BusinessDate Current system business date in a YYYY-MM-DD format

A copy of the SourceDate table can also be created to show the current business date for a different source
system. The names of these source system-specific tables follow the syntax SourceDate<Source System
Name>, e.g. SourceDateBS, and they have the structure below.
Column Name Description
SourceDateId Record Id from the SourceDate table
BusinessDate Current system business date from the SourceDate table
<Source System Name>BusinessDate (e.g. BSBusinessDate) Current business date for this specific source system
<Source System Name>EndOfMonth (e.g. BSEndOfMonth) End of Month date for this specific source system

dbo.SystemParameters
This table is a Data Manager Custom Table rule where system parameters specific to the
InsightStaging database are defined. As discussed in the Rules Engine section, the types and values for a
given system parameter are mapped. When the rule creation steps are run by an agent job, the
s_CreateRuleGroup procedure creates this table; hence all columns in this table are populated
automatically.

Column Name Description


SystemParametersLandingId Record Id (identity column).
ForeignKey Id of the associated Custom Table rule.
Value Value of the parameter to be defined by the business rule. E.g.
C:\Program Files\Microsoft SQL
Server\MSSQL13.MSSQLSERVER\MSSQL\DATA
Name Name of the parameter to be defined by the business rule. E.g.
SQL Data File Path
Type Type of process considered. E.g. Partitioning.
Source Source of the parameter defined, if any.

UpdateLog (deprecated)
This log was updated by the ETL stored procedures in releases before R15. It tracked execution times for the
extract, transform and load stages, rows processed, Type 1 or Type 2 changes, the number of updates and error
messages for each dimension, fact and bridge table. This log was used to review record counts, execution
times and ETL run progress.
Column Name Description
Batch Order by the Batch Desc to get the latest load listed first.

TableName Name of the Warehouse table eg. DimAccount


BatchStart Date/Time the ETL started
TableStart Time the table started to process
ExtractStart Time the extract phase started
RowsExtracted The number of rows extracted from the InsightSource tables.
ExtractFinish Time the extract phase completed.
TransformStart Time the transform phase started
RowsTransformed The number of rows transformed from the InsightSource tables.
TransformFinish Time the extract phase completed.
LoadStart Time the load phase started
NewRows The number of rows extracted from the InsightSource tables.
NewRowsTime Time the load phase completed.
Type1Changes Number of Type 1 dimension changes
Type1ChangesTime Time of the Type 1 changes
Type2Existing Number of Type 2 changes
Type2ExistingTime Time of the Type 2 changes
Type2Inserts The number of type 2 record inserts, records should only be inserted
when there is a change to a customer record.
Type2InsertsTime Time of the Type 2 inserts
CaseChanges Number of case changes
CaseChangesTime Time of the case changes
LoadFinish Time the Load phase completed
TableFinish Time the table completed
BatchFinish Date/Time the ETL completed
ExtractSeconds Elapsed processing time of extract phase
TransformSeconds Elapsed time of transform
LoadSeconds Elapsed time of load
NumberUpdates Number of dimension updates
LastErrorMessage Last error message logged
WarningMessage Last warning message logged

UpdateLogDetails (deprecated)
This was a transactional-style log that was updated by the ETL stored procedures in releases before R15.
Each procedure would write to UpdateLogDetails at the beginning of the procedure and before any
major steps within the procedure. This log was used to investigate process interruptions or failures, or
specific procedure execution times.

Column Name Description


UpdateLogDetailsId System generated id
UpdateLogId Foreign Key to Update Log
Date_ The date/time of the step
Action_ Extract/ Transform/ Load
Query In most cases the SQL query or stored procedure
that was running for the particular step.
Rows Number of rows returned
Details A description of the ETL step
For example, the following query:
SELECT [UpdateLogDetailsId]
    ,[UpdateLogId]
    ,[Date_]
    ,[Action_]
    ,[Query]
    ,[Rows]
    ,[Details]
FROM [InsightStaging].[dbo].[UpdateLogDetails]
WHERE LEFT(Query, 2) = 's_'
ORDER BY UpdateLogDetailsId DESC

can be used to find the last stored procedures that ran before a load failure. An administrator could scan
the list and find, for example, s_DimAccount_Transform @UpdateLogId=38; he or she could then run Exec
s_DimAccount_Transform @UpdateLogId=38 rather than the entire ETL while tracking down the issue.

v_Source Views
At least one v_Source view should exist for each Dimension and Fact combination. Each v_source view
should have at least the following fields.

The v_source view should be named as follows:

v_source + table name + SourceName + Sub System Abbreviation, e.g. v_sourceAccountBSXY.

The BS code above corresponds to the Core Banking source system in the UpdateOrder table; BSXY
corresponds to the table type in the UpdateOrder table.

The structure of each v_source view can vary greatly depending on the type of data to be mapped and on
the source system; however, all v_source views should contain the following:

FieldName Description
Source{tablename}Id E.g. SourceAccountId; this is the primary key of the view and should be unique within the view.
BusinessDate The snapshot date of the data.
Foreign Key References Natural foreign keys so that surrogate foreign keys can be populated, e.g. SourceCustomerID, SourceEmployeeID.
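
A minimal sketch of such a view is shown below; the source table, schema and column names are illustrative assumptions, and a real view would map the full column set. It cross joins to v_sourceDate for the snapshot date, as described in the System Views section below.

-- Minimal illustrative v_source view; object and column names are assumptions
CREATE VIEW dbo.v_sourceAccountBS AS
SELECT
    a.AccountNumber AS SourceAccountId,  -- primary key of the view, unique within it
    d.BusinessDate,                      -- snapshot date of the data
    a.CustomerId AS SourceCustomerId     -- natural foreign key for surrogate key lookup
FROM InsightSource.BS.ACCOUNT AS a
CROSS JOIN dbo.v_sourceDate AS d;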

System Views
A number of views are required for system purposes.

v_sourceDate{System}
Each system that is used needs to have this view, e.g. v_SourceDateBS.

The view returns the current date of the ETL and should return only one record. ETL will not run for the
system if these dates do not match the dates of the v_source views.

Field Description


SourceDateID For the extraction step

BusinessDate The latest date of the main banking system data source (typically).

{System}BusinessDate The date for which the source system data will be inserted into the
InsightWarehouse.

v_sourceDate
This view returns the current snapshot date. Each v_source view should cross join to this table.

v_MetaData
This is a system view that should not need any configuring or review.

Stored Procedures
dbo.s_InsightStaging_Update (Extract)
Description
This stored procedure represents the main stage of the Analytics ETL job which uses the Source Views to
transform the Temenos Core Banking data structure into the Warehouse Dimensional Model.

Steps
 Drop All Tables with the prefixes
o Source
o Staging
o Dim
o Fact
o Bridge
 Checks that the system to which the v_source view being processed belongs is enabled
(InsightStaging..Systems)
 Checks the BusinessDate of InsightMasterData.dbo.CurrentDate & dbo.SourceDate against the
BusinessDate of InsightSource.BS.SourceDate, using v_sourceDate{System} to get the correct date
o Errors if there is no match
 Executes the Extract step for each table
 Executes the stored procedure s_{tablename}_extract
 Populates (if applicable) the log tables and executes the corresponding stored procedures as indicated in
the UpdateOrder table
 Checks to see if previous updates were successful before continuing

Inputs
 @ExecuteExtractSteps – defines if the Extract step should be executed. It should be set to 1 when
Extract phase has to be performed.


 @ExecuteTransformLoadSteps – defines if the Transform and Load phases should be executed.


We can set this parameter to 0 if we wish not to run these two phases together with the Extract phase
(e.g. because we want Transform and Load to be executed separately). Alternatively, both this and the
previous input parameter can be set to 1 to execute Extract, Transform and Load at the same time –
this is not recommended however as it makes troubleshooting and debugging more difficult.

 @BatchNum – Batch number used to execute the stored procedure. This parameter introduces ETL
Batch control in the stored procedure; the batch number can be passed in as a parameter or obtained
indirectly from the dbo.Batch table through the dbo.fn_GetCurrentBatch function.

 @TotalThreads – allows to manually assign the number of threads used to execute the stored
procedure (accepts NULLs). Reserved for future development in this stored procedure.

dbo.s_InsightStaging_update (Transform)
Description
This stored procedure executes the ‘Transform’ and ‘Load’ steps defined in the UpdateOrder table

Steps
 EXECUTE procedures
o s_Fact{tableName}_Transform
o s_Fact{tableName}_Load
o s_Dim{tableName}_Transform
o s_Dim{tableName}_Load

Inputs
 @ExecuteExtractSteps – defines whether the Extract step should be executed. It should be set to 0 if the
Extract phase has already been performed.

 @ExecuteTransformLoadSteps – defines whether the Transform and Load phases should be executed.

We can set this parameter to 1 if we wish these phases to run after the Extract phase has successfully
completed.

 @BatchNum – as previously described

 @TotalThreads – as previously described
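
Correspondingly, once the Extract phase has completed successfully, the Transform and Load phases could be run as follows:

Exec dbo.s_InsightStaging_Update
     @ExecuteExtractSteps       = 0
    ,@ExecuteTransformLoadSteps = 1
    ,@BatchNum                  = 1      -- same placeholder batch as the Extract run
    ,@TotalThreads              = NULL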

dbo.s_Generic_Extract
Description
This is a generic stored procedure internally called by the s_DimXXX_Extract and s_BridgeXXX_Extract
stored procedures.


Steps
 Drops and recreates the relevant DimXXX table based on the InsightWarehouse data
dictionary definition
 Drops and recreates the relevant FactXXX table based on the InsightWarehouse data
dictionary definition
 Creates the StagingXXX table based on the content of the BridgeXXX and SourceXXX tables
 Inserts and merges records into the StagingXXX table
 Applies business rules assigned to the Extract phase to the StagingXXX table, internally calling
InsightETL.dbo.s_CreateRuleGroup
 Updates logs

Inputs
 @StagingEventLogId – Foreign key pointing to dbo.StagingEventLog
 @ProcId – The calling procedure's object identifier
 @SourceMD – when set to 1 (true), the manual entry (MD) steps are performed for the SourceXXX
table
 @Transpose – when set to 1 (true), the data is transposed
 @RecreateFromDW – when set to 1 (true), the DimXXX or BridgeXXX table is recreated based on the
[$(InsightWarehouse)].dbo.Dim(/Bridge)XXX table
 @RecreateFactFromDW – set to false by default; when true, the FactXXX table is recreated based
on the one in [$(InsightWarehouse)]
 @CreateStagingTable bit – when true, the staging table is created from the corresponding
Dim(/Bridge)XXX and SourceXXX tables
 @InsertMergeStagingTable – when true, the procedure inserts/merges the records into the
StagingXXX table
 @StagingMD – when set to true, the manual entry steps are performed for the staging table

dbo.s_Generic_Transform
Description
This is a generic stored procedure internally called by the s_DimXXX_Transform and
s_BridgeXXX_Transform stored procedures.

Steps
 Applies business rules assigned to the Transform phase to the StagingXXX table, internally
calling InsightETL.dbo.s_CreateRuleGroup
 Inserts and merges the resulting columns into the StagingXXX table
 Populates the DimXXX, FactXXX and BridgeXXX tables
 Updates Logs

Inputs
@StagingEventLogId – Id of the corresponding StagingEventLog entry

@ProcId – Id of the calling stored procedure, used to derive the table name from the procedure name

@SqlBeforeMDStaging – Optional legacy parameter, to be left null

@CommentsForSqlBeforeMDStaging – Optional legacy parameter, to be left null

@PricingUpdate – Optional parameter not needed except for DimAccount; it should be set to true (1)
only when the DimAccount table is processed and the optional Customer Profitability module is installed

@MDStaging – If true, the pre-defined InsightETL business rules are applied against the StagingXXX table

@SqlBeforeInsert – Optional parameter not needed except for DimAccount. If specified, additional
statements will be carried out right before the @SqlTransform (INSERT INTO)

@CommentsForSqlBeforeInsert – The comments for the above statement; for logging purposes only

@SqlInsert nvarchar(max) – Specific INSERT INTO statement for the transform, including three
placeholders: *TABLE*, *ALL*, *ALLDWS*

@AddPK – If true, the procedure will try to add a primary key. Avoid this for staging FactXXX tables
because such a costly operation usually has no reasonable use.

@CustomPKColumn – Optional parameter not needed except for DimDate. If specified and @AddPK =
1, the designated column will be used as the primary key; otherwise, the default column matching the
pattern 'Source*Id' will be used as the PK

@SqlPostInserted – Optional parameter not needed except for DimIndividual. If specified, additional
statements will be carried out right after the primary key is created

@CommentsForSqlPostInserted – The comments for the above statement; for logging purposes only

dbo.s_Generic_Load
Description
This is a generic stored procedure internally called by the s_DimXXX_Load, s_FactXXX_Load and
s_BridgeXXX_Load stored procedures.

Steps
 Loads the DimXXX, FactXXX and BridgeXXX tables into the InsightWarehouse
 Updates Logs

Inputs
 @StagingEventLogId – Id of the corresponding StagingEventLog entry
 @ProcId - Id of the calling stored procedure in order to figure out the table name from its name

For Facts only

 @TypeOfStandardWhereClause – Indicates whether the fact loading process is based on the
BusinessDate or on the existence of the business key (Source*Id). Acceptable values are:
o 0 or negative: Non-standard; use @CustomWhereClause instead
o 1: Loads facts based on the matching [BusinessDate]
o 2: Loads facts based on the existence of the corresponding business key (e.g.
[SourceAccountId]) – only the rows with new SourceAccountId values will be loaded
 @CustomWhereClause – Custom T-SQL WHERE clause; for FactAcctTranSub_Load only
 @IndexingBusinessKey – If set to 1 (true), the procedure will try to index the business key as
nonclustered before executing the INSERT INTO statement in s_Load_Fact.

Note: In Model Bank, only seven stored procedures load based on the existence of the business key:
1) FactAcctTran_Load; 2) FactActivity_Load; 3) FactBrokerTrade_Load; 4) FactEvent_Load;
5) FactGlAdjmt_Load; 6) FactGlTran_Load; 7) FactOrder_Load. All other procedures load facts
exclusively based on BusinessDate.

dbo.s_Dim{tablename}_Extract
Description
This kind of stored procedure triggers a number of stored procedures as configured by the UpdateOrder
table under the Action_ field. Although this SP is named with the Dim prefix, it materialises the source
views into source tables (which are not defined as Fact or Dim), and it also creates both the Fact and the
Dim tables in InsightStaging from the structure defined in InsightWarehouse.

Steps
 The procedure first queries the UpdateOrder table and executes procedures based on the records in
UpdateOrder.
 It then loops through the UpdateOrder table and executes various stored procedures to load the
dimension, for records where:
o UpdateOrder.Action_ = ‘Extract’ and
o UpdateOrder.Action_ = ‘Extract.Substep’
 EXECUTE s_Extract_SourceTable
o This extracts the v_source{TableName}{TableType} view into
InsightStaging..Source{TableName}{TableType}, where TableType is the corresponding column in
UpdateOrder. For example, the UpdateOrder record partially described in the table below will result
in the v_sourceEmployeeBSDAO view being inserted into the table SourceEmployeeBSDAO.
UpdateOrderID  Configuration  TableName    SourceSystem  TableType  Action_          Enabled_  Exclude
10310101       ModelBank      DimEmployee  BS            BSDAO      Extract.substep  1         0

 Then the procedure drops and creates the Dim/Fact tables
o EXECUTE s_Transform_CreateTableFromDW
 Runs for both Dim and Fact tables
 Creates the staging table based on both the Dim and source tables
o EXECUTE s_Transform_CreateStagingTable
o EXECUTE s_sourceTable_OutOfRangeUpdate
 Checks for data type conflicts between the source and staging tables
 For records where UpdateOrder.Action_ = ‘Extract’ and UpdateOrder.Action_ = ‘Add.Substep’ or
‘Merge.Substep’:
 Add.Substep adds the materialised source{tableName}{tabletype} to Staging{TableName}
o EXECUTE s_StagingTable_update
 Merge.Substep
o EXECUTE s_StagingTable_Merge
 Adds custom columns to staging tables
o EXECUTE s_transform_MD

Inputs:

StagingEventLogID – Id for the logging table

dbo.s_Dim{tablename}_Transform
Description:
Updates the staging table with custom columns assigned to the ‘Transform’ execution phase and inserts
all data from Staging{tableName} into Dim{tableName} within InsightStaging.

Steps
 Runs for ‘Transform’ custom column entries relating to the particular table being processed.
 EXECUTE s_transform_MD
 Creates the INSERT statement, taking the fields from the sys.columns records for Dim{TableName}
 Inserts all data from the staging table into the Dim{tablename} table of InsightStaging

Inputs:
 StagingEventLogID – as previously defined

dbo.s_Dim{tablename}_Load
Description:
Simply runs the dimension load stored procedure (s_Load_Dimension)

Steps
 EXECUTE s_Load_Dimension

Inputs:
 StagingEventLogID – as previously defined

dbo.s_Load_Dimension
Description
Loads data into the final InsightWarehouse tables

Steps
 Get columns from the Data Dictionary
 Load new rows
o Look for new rows in the staging Dim table
 New rows are identified by looking for a NULL primary key
o Insert the rows into the InsightWarehouse tables
 For each SCD type, run the relevant queries
 SCD Type 1
o Replaces existing rows in the Dim table where a change has occurred to a Type 1 column as
defined in the data dictionary
 SCD Type 2
o Adds a new row in the Dim table where a change has occurred to a Type 2 column as defined in
the data dictionary
o New rows have Active set to 1; old rows are set to 0 in Dim{tablename}.

Inputs:
 StagingEventLogID – as previously defined

 ChangeCase
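
A simplified sketch of the SCD Type 2 handling described above; the actual procedure generates these statements dynamically from the data dictionary, so the table and column lists here are abbreviated:

Declare @BusinessDate date = '2014-01-31'   -- snapshot being loaded

-- Expire the currently active row when a Type 2 attribute has changed
Update dw
Set    dw.Active = 0
From   InsightWarehouse.dbo.DimAccount dw
Join   InsightStaging.dbo.DimAccount stg
       on stg.SourceAccountId = dw.SourceAccountId
Where  dw.Active = 1
  and  stg.SCD2HashCS <> dw.SCD2HashCS    -- hash over the Type 2 columns

-- Insert the new version as the active row (column list abbreviated)
Insert Into InsightWarehouse.dbo.DimAccount
       (SourceAccountId, Active, Added, SCD2HashCS)
Select stg.SourceAccountId, 1, @BusinessDate, stg.SCD2HashCS
From   InsightStaging.dbo.DimAccount stg
Where  Not Exists (Select 1 From InsightWarehouse.dbo.DimAccount dw
                   Where dw.SourceAccountId = stg.SourceAccountId
                     and dw.Active = 1)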

dbo.s_Fact{tablename}_Extract
Description
Nothing is done here other than logging an extract event for the fact table. The table is extracted in the
Dimension extract.

Steps
Update log details

Inputs:
 StagingEventLogID – as previously defined

dbo.s_Fact{tablename}_Transform
Description
Updates the staging table with custom columns assigned to the ‘Transform’ execution phase and inserts
all data from Staging{tableName} into Fact{tableName} within InsightStaging.

Steps
 Runs for ‘Transform’ custom column entries relating to the particular table being processed.
 EXECUTE s_transform_MD
 Creates the INSERT statement via hard-coded joins, taking the fields from the sys.columns records
for Dim{TableName}

NB: These JOIN statements can be amended: if, for example, an extra ID is needed for a Dim
table, a new join may be required to be able to show that Id in the table.

 Inserts all data from the staging table into the Fact table of InsightStaging


Inputs
 StagingEventLogID – as previously defined

dbo.s_Fact{tablename}_Load
Description
Runs the fact load stored procedure (s_Load_Fact)

Steps
 Deletes data from the Warehouse Fact table for the current business date
o This prevents the reprocessing of a batch from causing duplicates
 EXECUTE s_Load_Fact
o To load the Warehouse tables

Inputs
 StagingEventLogID – as previously defined

dbo.s_Load_Fact
Description
Loads data into the final InsightWarehouse tables

Steps
 Get Columns from Data Dictionary
 Load New Rows for Latest Business Date from staging Fact to Warehouse Fact

Inputs:
 StagingEventLogID – as previously defined

 ChangeCase

dbo.s_Extract_SourceTable
Description
Drops and creates the Source table from the source view.

Steps
 Checks whether the table exists and drops it if needed
 Checks for duplicates using the PK passed as an input variable
 Creates the primary key on the source table

Inputs
 StagingEventLogID – as previously defined

 TableName – name of the table to be processed


 SourceTableName – name of the source table

 SourcePK – primary key on source table

dbo.s_Transform_CreateTableFromDW
Description
Recreates the Dim/Fact tables from the InsightWarehouse tables

Steps
 Drops the InsightStaging Dim/Fact table if it exists
 Creates the table from the InsightWarehouse equivalent
 Adds a primary key to the field titled {tablename}ID (AccountID on FactAcctTran is the exception to this)
 Drops computed columns

Inputs
 Table Name – name of the table to be created

 StagingEventLogID – as previously defined

 Action – input parameter for logging purpose only, should be left null

dbo.s_Transform_CreateStagingTable
Description
Creates Staging Tables within InsightStaging

Steps
 Drops the InsightStaging staging{tablename} table, if it exists
 Retrieves all columns from the equivalent source{tablename}
 Retrieves all columns from the equivalent Dim/Fact/Bridge{tablename}
 Combines the field lists to create staging{tableName}

Inputs
 Staging Table – name of the table to be created

 StagingEventLogID – as previously defined

 Action – input parameter for logging purpose only, should be left null

dbo.s_SourceTable_OutOfRangeUpdate
Description
Checks for datatype/size conflicts between the source and target tables, and reports them in the logging table.

Inputs
 Source Table – name of the source table to be checked

 Target Table – name of the target table to be checked

 StagingEventLogID – as previously defined

dbo.s_StagingTable_update
Description:
This procedure populates the staging table with new records from the source table. It inserts only new data
into Staging{tableName} from Source{tableName} where the source{tableName}ID is not already in
the staging table.

Inputs
 Staging Table – name of the staging table to be populated

 System – source system name e.g. BS

 Where Clause – any filter applied to source data

 StagingEventLogID – as previously defined

dbo.s_StagingTable_merge
Description
This procedure updates the staging table with the latest versions of records. It updates records in
Staging{tableName} from Source{tableName} where the source{tableName}ID of the source table
matches the staging table's source{tableName}ID.

Inputs
 Staging Table – name of the staging table to be populated

 System – source system name e.g. BS

 Where Clause – any filter applied to source data

 StagingEventLogID – as previously defined
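
A hedged sketch of the two patterns side by side; the real procedures build these statements dynamically and the column lists here are abbreviated:

-- s_StagingTable_update: insert only records not yet in staging
Insert Into StagingAccount (SourceAccountId, BusinessDate)
Select s.SourceAccountId, s.BusinessDate
From   SourceAccountBSWM s
Where  Not Exists (Select 1 From StagingAccount st
                   Where st.SourceAccountId = s.SourceAccountId)

-- s_StagingTable_merge: refresh staging rows with the latest source values
Update st
Set    st.BusinessDate = s.BusinessDate
From   StagingAccount st
Join   SourceAccountBSWM s on s.SourceAccountId = st.SourceAccountId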

dbo.s_Transform_CreateTableFromSS
Description
Caters for the creation of Custom Table business rules


Steps
 Drops the target table if it exists (Target Table = the CustomColumn field from
InsightMasterData.CustomColumn)
 Creates the target table from the source table using a filter. The actual fields from
InsightMasterData.CustomColumn are listed below:
o Target Table = CustomColumn
o Source Table = SourceColumn
o Filter = SourceDataFilter

Inputs
 Source Table Name – name of the source table

 Target Table Name – name of the target table to be created

 Where – any filter applied to source data

 StagingEventLogID – as previously defined

dbo.s_CustomCode_Execute
Description
Runs the custom code defined in an InsightETL Dataset business rule, e.g. “update table set field = value”.
See the InsightETL chapter for details.

Steps
 Creates a user with permissions to insert, delete, update, alter on the dim/fact/bridge/staging/source
tables of InsightStaging
 EXECUTE the custom code passed (taken from the InsightETL.dbo.RuleDefinition table for Dataset
records)

Inputs:
 CustomCode – text storing the custom code to be executed

dbo.s_Bridge{tablename1}{tablename2}_Extract
Same structure as the stored procedure to extract a fact table

dbo.s_Bridge{tablename1}{tablename2}_Transform
Same structure as the stored procedure to transform a fact table

dbo.s_Bridge{tablename1}{tablename2}_Load
Same structure as the stored procedure to load a fact table


dbo.s_EventLog_Add
Description
This stored procedure adds new rows to the EventLog table.

dbo.s_EventLogDetails_Add
Description
This stored procedure adds new rows to the EventLogDetails table.

dbo.s_transform_MD (deprecated)
Description
In Pre-R18 releases, this stored procedure (in the InsightStaging database) updated staging tables and fact
tables based on the MasterData stored in the CustomColumn and CustomValue tables. Specifically, it
created custom Column fields in staging tables and staging FactTables.

Steps
For each CustomColumn record, the relevant stored procedure is executed depending on the custom
column type.

Type 0

 Add Column(s) to staging table


 Update New Column with values from the CustomValue table.
o Used for value mapping, using input/source columns with values which then map to output
column values.

Type 1

 Add Column to staging table


 Update New Column with values from the CustomValue table using case statements to create
buckets.

Type 3

 EXECUTE s_CustomCode_Execute

Type 2

 Add Column to staging table


 Update New Column to Source Column value
o Used for applying aliases to fields

Inputs:
 TableName – Name of the InsightStaging table to be updated (e.g. StagingAccount)
 ETL Phase – ETL phase in which the update should take place (i.e. Extract or Transform)
e.g.: Exec s_transform_MD ‘StagingAccount’, ‘Extract’


Configuration
The items that require configuration are:

 UpdateOrder table that controls the Analytics ETL flow.


 v_SourceViews
 Systems table

Creating v_Source Views


Each dimension and fact table is populated by at least one v_Source view; typically a single view populates
both the dimension and the fact table.

For instance, if FactAccount and DimAccount need to be populated then a view called v_SourceAccountXX
needs to be created, where XX signifies the name of the source system from which Account data is
extracted. So in many cases, the v_SourceAccountBS view will populate account records from the banking
system (BS) into DimAccount and FactAccount.

Fact and dimension tables for the same object, in this example DimAccount and FactAccount, can be
populated by different sub-systems (e.g. Core banking modules) in the same source system.

If, for example, some account records in a specific Analytics installation come from the main account module
in Core Banking, some from the MM (Money Market) module and others from the LD (Loans and Deposits)
module, then three v_Source views need to be created: v_SourceAccountBS, v_SourceAccountBSMM, and
v_SourceAccountBSLD. It should be noted that unions should never be done inside a v_Source view.

v_Source views for second-level subsystems can also exist – e.g. the Core Banking module called AA
(Arrangement Architecture) is structured into AA Accounts, AA Deposits, AA Lending, etc. Therefore, if one
or more of the AA module subproducts are enabled in the current Analytics installation, the corresponding
v_source views need to be created, e.g. v_SourceAccountBSAA_Accounts,
v_SourceAccountBSAA_Deposits, v_SourceAccountBSAA_Lending, etc.

Similarly for another source system that is going to populate DimAccount and FactAccount, say the Budget
SourceSystem, a v_source view called v_SourceAccountBudget would be created.

Each v_SourceView should contain a natural key field called Source{TableName}ID, where {TableName}
would be Account in the example above with DimAccount and FactAccount. This key must be unique for
all of the views for a particular staging table such as stagingAccount in this example.

Adding a v_SourceView
For example, a new wealth management subsystem (abbreviation WM) has been added to the banking
system (SourceSystem = BS), and account records from this subsystem need to be added to DimAccount
and FactAccount.

Step 1


Ensure the required tables are available in InsightSource. If not, configure InsightImport (which stores data
extracted from Temenos Core Banking only) and InsightLanding appropriately so that the table is transferred
to InsightSource. See the InsightImport Configuration and InsightLanding Configuration sections for more details.

Step 2

Create the v_source view. The v_source view should be named as follows:

v_Source + table name + SourceName + subsystem abbreviation

e.g. v_sourceAccountBSWM.

The v_source view should have the following fields:

ColumnName         Example                    Description
SourceAccountID    Branch_Co_Mne+':'+[@ID]    Mandatory: a combination of fields to
                                              uniquely identify the account.
BusinessDate       2014-01-31                 Mandatory: the snapshot date of the extract.
                                              This date is obtained by cross joining to
                                              the InsightStaging v_SourceDate view.
SourceCustomerID   Customer.[@ID]             Optional: a combination of fields that
                                              returns the same values as SourceCustomerID
                                              in the v_SourceCustomerBS view.
SourceEmployeeID   Employee.[@ID]             Optional: a combination of fields that
                                              returns the same values as SourceEmployeeID
                                              in the v_SourceEmployeeBS view.
Any business field                            Optional: other foreign key references and
                                              other fields.
For example, the v_Source view could look as follows:

Create view v_SourceAccountBSWM as
Select
     D.BusinessDate
    ,WM.Branch_Co_Mne + ':' + WM.[@Id] as SourceAccountID
    ,AB.[@ID] as SourceCustomerID
    ,WM.EmpID as SourceEmployeeID
    ,WM.Amount as Balance
From InsightSource.BS.tableWM WM
Left Join InsightSource.BS.tableAB AB
    on WM.ID = AB.ID
Cross Join v_SourceDate D

Step 3

Add new entries to update order table so that v_source view data is added to the appropriate Dimension
and Fact tables. See Update Order table description.

Update Order
The UpdateOrder table controls which dimensions and facts are processed, and which v_Source views are
used to populate them.

Each dimension and fact table has corresponding entries in the UpdateOrder table, which control the Extract,
Transform and Load processes.

Turning on Dimensions and Fact Tables


In order to have a v_source view populate a dimension or fact table, at least three records need to be
present in the UpdateOrder table.

We can see a partial sample of the UpdateOrder entries used to populate DimAccount in the table below:

UpdateOrder  Configuration  TableName   SourceSystem  TableType  Action_          Enabled_  Exclude
10600000     Framework      DimAccount                           Extract          1         0
10610101     Configuration  DimAccount  BS            BSWM       Extract.Substep  1         0
10610104     Configuration  DimAccount  BS            BSWM       Add.Substep      1         0
10610201     Configuration  DimAccount  Budget        Budget     Extract.Substep  1         0
10610204     Configuration  DimAccount  Budget        Budget     Add.Substep      1         0

During the Extract Process, the above records will result in the following tables being created:

InsightStaging.SourceAccountBSWM

InsightStaging.SourceAccountBudget

These two tables above will then be inserted into a single StagingAccountTable.
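
As an illustration, a new substep could be registered with an insert along these lines; the UpdateOrderID value is a placeholder that must fit the site's numbering scheme, and the column list follows the samples shown in this section:

Insert Into dbo.UpdateOrder
    (UpdateOrderID, Configuration, TableName, SourceSystem, TableType, Action_, Enabled_, Exclude)
Values
    (10610105, 'Local', 'DimAccount', 'BS', 'BSWM', 'Merge.Substep', 1, 0)  -- placeholder key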


Dimension Transformation and Load processes are switched on as shown in the partial representation of
Update order entries in the following table:

(Nothing needs to be done if this is an existing dimension being loaded by other v_source views)

UpdateOrder  Configuration  TableName   SourceSystem  TableType  Action_    Enabled_  Exclude
21100000     Framework      DimAccount  NULL          Dimension  Transform  1         0
21200000     Framework      DimAccount  NULL          Dimension  Load       1         0

The Fact Table Transformation and Load processes are switched on as shown in the table below (again,
not all columns are shown):

(Nothing needs to be done if this is an existing Fact table being loaded by other v_source views)

UpdateOrder  Configuration  TableName    SourceSystem  TableType               Action_    Enabled_  Exclude
13900000     Framework      FactAccount  NULL          Periodic Snapshot Fact  Extract    1         0
27800000     Framework      FactAccount  NULL          Periodic Snapshot Fact  Transform  1         0
1000000      Framework      FactAccount  NULL          Periodic Snapshot Fact  Load       1         0

Fields from the v_source view are added to either the Fact or Dimension table depending on their
description in the InsightWarehouse..DataDictionary table.

Adding Systems
New systems need to be enabled in InsightStaging..Systems. Ensure that Enabled_ is set to 1 for all active
source systems.

The source system name should match the SourceName in InsightLanding..ExtractList.

The table below shows a partial example of enabled BS and Budget source systems in the Systems table

SourceSystem  SystemType  SystemModules  SystemRank  Enabled_  Exclude
BS            BS          NULL           1           1         0
Budget                    NULL           11          1         0
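
For example, enabling a source system can be as simple as the following update (column names follow the sample above):

Update InsightStaging..Systems
Set    Enabled_ = 1
Where  SourceSystem = 'Budget'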

Adding Tables from a new Source System


We will now discuss how to add tables from a new source system.
Let us assume, for example, that a new source system containing customer account data needs to be
added to the Insight Data Warehouse. The data is at the same level of granularity as the banking
system, and it is snapshot-type data. The system abbreviation will be “INS”.
Preparation
You should prepare extracts from the source system. These extracts should be a snapshot of Customer and
Account details plus any incremental transactions.

Then you should map the extracted data to existing InsightWarehouse objects (using InsightStaging
v_source views and InsightWarehouse Data Dictionary table).

InsightLanding
Assuming the source system has two tables, Customer and Account, update the InsightLanding ExtractList
and ExtractSourceDate tables with the new data source tables and date.

ExtractList
The table below shows a partial example of Extract list configuration to add tables from a new source
system called INS.

Column Name Record1 Record2


ExtractListId 1 2
SourceName INS INS
SourceServer
SourceDB INSData INSData
SourceSchema Dbo Dbo
SourceTable Account Customer
TargetTable NULL NULL
ImportFlag 1 1
ImportFields * *
ImportOrder 1 2
WhereClause NULL NULL
UserId Dbo Dbo
PurgeOlderThan NULL NULL
CombinedTranDays NULL NULL
PrimaryIndexColumns NULL NULL

ExtractSourceDate
The table below shows a partial example of Extract Source Date configuration to add tables from a new
source system called INS.

Column Name    Record1                              Description
SourceName     INS                                  Name of the Source System
BSDateSQL      select @bsdate = MISDate from        T-SQL statement which extracts the
               INSData.dbo.{table that returns a    current date for the source system
               snapshot date}
Configuration  Local                                Defines the information source for the
                                                    current row – available values are:
                                                    - ModelBank: this entry has been added
                                                      to satisfy Temenos core banking
                                                      mapping and/or business rules
                                                    - Local: the entry is used during the
                                                      implementation to update or enhance
                                                      Framework or ModelBank functionality
                                                    - Framework: this entry has been added
                                                      to the TFS Framework solution and is
                                                      core banking agnostic
                                                    - PBModelBank: used for Private Banking
                                                      record definitions
                                                    - CampaignAnalytics: used for Campaign
                                                      Analytics solutions
                                                    - Predictive: used for the Predictive
                                                      Analytics solution when deployed
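
A hedged sketch of registering the INS date query; the ExtractSourceDate table may carry further columns, and the table holding MISDate is a placeholder:

Insert Into ExtractSourceDate (SourceName, BSDateSQL, Configuration)
Values ('INS',
        'select @bsdate = MISDate from INSData.dbo.SystemDate',  -- date table name assumed
        'Local')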

InsightSource:
You need to create a new schema for the data source in InsightSource. The schema name should be the
same as the SourceName in InsightLanding..ExtractList.

IF NOT EXISTS (SELECT * FROM sys.schemas WHERE name = 'INS')
BEGIN
    EXEC ('CREATE SCHEMA [INS] AUTHORIZATION [dbo]')
END

Update s_InsightSource_Synchronize_Schema and copy it to post-deploy. This ensures that when history is
re-processed, the v_source views for the new source system will not fail when run for dates before the data
existed.
For example, add this to s_InsightSource_Synchronize_Schema:

IF NOT EXISTS (SELECT * FROM sys.tables WHERE name = 'Customer')
BEGIN
    CREATE TABLE INS.Customer
    (CustomerID varchar(20), FieldTwo varchar(30), MISDate datetime)
END
IF NOT EXISTS (SELECT * FROM sys.tables WHERE name = 'Account')
BEGIN
    CREATE TABLE INS.Account
    (AccountID varchar(30), CustomerID varchar(20), FieldTwo varchar(30), MISDate datetime)
END

InsightETL (Rules Engine)


Assuming that existing rule definitions are used, you should add mappings for new rule values using the
Analytics front end where appropriate.

InsightStaging
You should add the new v_source views, v_SourceCustomerINS, and v_SourceAccountINS.
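
A minimal sketch of one of these views, assuming a v_SourceDateINS view exists per the v_sourceDate{System} convention and using the INS.Customer columns created above:

Create view v_SourceCustomerINS as
Select
     D.BusinessDate
    ,C.CustomerID as SourceCustomerID
    ,C.FieldTwo                          -- any further business fields
From InsightSource.INS.Customer C
Cross Join v_SourceDateINS D             -- assumed per the v_sourceDate{System} convention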

Then you should add new corresponding extract.substep and add.substep records to the Update Order
table.
A partial example is provided in the table below:

UpdateOrder  Configuration  TableName    SourceSystem  TableType  Action_          Enabled_  Exclude
10610101     Configuration  DimAccount   INS           INS        Extract.Substep  1         0
10610104     Configuration  DimAccount   INS           INS        Add.Substep      1         0
10610201     Configuration  DimCustomer  INS           INS        Extract.Substep  1         0
10610204     Configuration  DimCustomer  INS           INS        Add.Substep      1         0

Add to Systems table. Enable when ready to test.


InsightWarehouse
Add new fields to the InsightWarehouse Data Dictionary table if required.


InsightWarehouse
Overview
The Insight Warehouse database is the end point for all data to be stored for analytical reporting and ad
hoc analysis. It stores multiple dates of data for multiple source systems in a star schema dimensional
model based on Kimball data warehousing methodology. This model is optimized for query performance
and the storage of large amounts of both transactional and snapshot data.

In previous releases, Temenos Analytics loaded data into Dimension and Fact tables using traditional
rowstore storage. In a rowstore, data is logically organized as a table with rows and columns, and then
physically stored in a row-wise format.

As data volumes increased, storing data in InsightWarehouse in rowstore format translated into higher
disk space requirements.

Microsoft SQL Server introduced the columnstore index, a technology for storing, retrieving and
managing data using a columnar data format. This type of index stores data column-wise instead
of row-wise.

Temenos Analytics has adopted and implemented columnstore index technology in InsightWarehouse. The
rowstore primary key index has been replaced with a clustered columnstore index on all Bridge and Fact
tables.

The main benefits of using columnstore indexes in InsightWarehouse are high compression rates and large
query performance gains over traditional row-oriented storage. Disk space requirements have dropped
by up to 90%.
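
For illustration, converting a fact table to columnstore storage takes a single statement of this shape (the index name is an assumption):

-- Clustered columnstore index replacing the rowstore primary key on a fact table
CREATE CLUSTERED COLUMNSTORE INDEX CCI_FactAccount
ON dbo.FactAccount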

Specific Features / Functions


Feature             Description
Dimensional Model   All data is stored in a star schema dimensional model based on the
                    Kimball dimensional modelling approach, relying on Fact, Dimension,
                    Bridge tables, etc. The model utilizes slowly changing conformed
                    dimensions to optimize data storage and query execution and allows
                    flexible management of slowly changing dimension attributes. Both
                    Type 1 SCD (e.g. for Tax ID or Birth Date) and Type 2 SCD (e.g. for
                    Customer Branch or Product Type) can be used.
Abstraction Layer   Abstraction layers are provided for end users to consume data for
                    reports and other uses, e.g. materialized views for GL, customer,
                    account, transaction, etc.
                    These layers reduce the complexity of core banking using simple
                    table structures. By removing much of the dimensional model
                    complexity, they significantly simplify the joins required to
                    combine data and remove the need for the data consumer to have
                    intimate knowledge of how to query a dimensional model.
Extensible design   It is relatively simple to add new columns, business rules, tables
                    or source systems to the existing model.
Metadata Driven     The data warehouse is controlled and populated based entirely on
                    the metadata contained in the data dictionary configuration table.
                    This makes it easy to add additional columns to tables and the
                    abstraction views and to control the slowly changing dimension
                    attributes of dimension table columns.


Technical Details
Architecture
In the figure below, we can see how the InsightWarehouse database fits into the Advanced Analytics
platform's architecture.

Figure 30 - InsightWarehouse in the Core Analytics ETL Flow


Data Model
The diagram below offers a representation of the InsightWarehouse data model.

Standard Dimension tables are represented in green.

Role Playing Dimensions are dimensions which can play different roles in a fact table depending on the
context. For example, time appears in most types of analysis because business activities happen within a
timeframe and objects exist in time; time is almost always used when calculating and evaluating measures
in a fact table, and the time information may be required in different formats such as ‘Time of Day’, ‘Time in
minutes’ or ‘Time with AM or PM’. To define different views of the same dimension, we use a particular
kind of dimension table called a Role Playing Dimension table, represented in pink.

Transaction Fact tables are represented in purple. The grain associated with this kind of fact table is
usually specified as "one row per line in a transaction", e.g. every line on a receipt. Typically, a transaction
fact table holds data at the most detailed level, causing it to have a great number of dimensions associated
with it.

There are also tables in charge of representing business performance at the end of each regular, predictable
time period. These are called Periodic Snapshot Fact tables; they are represented here in blue and, as
their name suggests, they are used to take a "picture of the moment", where the moment could be any
defined period of time, e.g. a performance summary of product sales over the previous month. A periodic
snapshot table is dependent on an underlying transaction fact table, as it needs the detailed data held in
the transaction fact table in order to deliver the chosen performance output.

Finally, our diagram shows a set of Bridge tables, represented in orange. A bridge table sits between
a fact table and a dimension table and is used to resolve many-to-many relationships between a fact and
a dimension. A bridge table contains only two dimension columns, the key columns of both dimensions.
For example, suppose there are two dimensions, Customer and Account: a bridge table can be created by
joining Customer and Account using their dimension keys (e.g. the customer number on the account
table). Please note that bridge tables are fact-less, with no measures.
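
As a sketch, resolving the customer-to-account many-to-many relationship through the BridgeAccountCustomer table shown in the model could look as follows:

-- All accounts linked to each customer on a given snapshot date
Select c.CustomerNum, a.AccountNum, b.JointType
From DimCustomer c
Join BridgeAccountCustomer b on b.CustomerId = c.CustomerId
Join DimAccount a on a.AccountId = b.AccountId
Where b.BusinessDate = '2014-01-31'
  and c.Active = 1    -- current version of the SCD Type 2 customer row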


[Data model diagram: the InsightWarehouse star schema, showing standard Dimension tables (green, e.g. DimCustomer, DimAccount, DimBranch, DimProduct, DimEmployee, DimGL), Role Playing Dimensions (pink, e.g. DimDate, DimTime), Transaction Fact tables (purple, e.g. FactAcctTran, FactAcctTranSub, FactGLTran), Periodic Snapshot Fact tables (blue, e.g. FactAccount, FactCustomer, FactGL, FactProduct) and Bridge tables (orange, e.g. BridgeAccountCustomer, BridgeCustomerCollateral, BridgeLimitCollateral), together with their key columns and foreign-key relationships.]
Figure 31 - InsightWarehouse Data Model


Abstraction Views
[Diagram: the InsightWarehouse abstraction views (e.g. v_Customer, v_Account, v_AcctTran, v_GL, v_GLTran, v_Limit, v_CustomerLimit, v_Collateral, v_Channel, v_Product, v_Portfolio, v_Employee, v_ThirdParty, v_Activity, v_Order, v_BrokerTrade, v_SecurityPosition), each exposing the flattened columns of the underlying dimensional model.]
CustomerTarget AmortMatureDate TranInitiation GLThirdPartyAttribute7 SystemSource EmployeeStatus
CustomerType Authorized IsCustTran GLThirdPartyAttribute8 EmployeeType
CustProfitGroup AvailableFunds TranServiceCharge GLThirdPartyAttribute9 EmplStartDate
CustProfitStatus Balance SystemSecurityId PLCategory NetworkUserName
CustRegion BalanceGroup EBSystem SystemSecurityId ParentLevel1
CustSourceSystem BrokerName EBSystemCode GLUnAdjmtAmt ParentLevel1Code
CustStartDate Category FlowSubType GLUnAdjmtForeignAmt ParentLevel2
DeceasedDate Classification FlowType GLPrevAssetType ParentLevel2Code
DefaultPhone ClosedDate SourceProductId PLConsolKey ParentLevel3
DepsBalance Currency Company PLResidence ParentLevel3Code
DepsBalGroup DelinquentAmount PLSector ParentLevel4
FirstName DelinquentDays PlTerm ParentLevel4Code
FullName DelqDayGroup Company ParentLevel5
Gender DisburseDate GLActualAmount ParentLevel5Code
HasNonClassified ExternalRiskCode GLAdjmtAmount
v_AcctTranSub
HasProduct1 ExternalRiskDesc GLAdjmtForeignAmt
HasProduct10 FeePlan AcctTranId GLAmount
HasProduct2 FixorVar AcctTranSubId GLBudgetAmount
HasProduct3 ForeignCurrencyBal SourceAcctTranID GLForeignAmount
HasProduct4 FTPStartDate SourceAcctTranSubID
HasProduct5 Group_
HasProduct6 HoldsTotal
HasProduct7 InterestAccrued
HasProduct8 InterestPaidFreq
HasProduct9 InterestRate
HasService1 InterestRateGroup
HasService2 InterestRateIndex
HasService3 InterestRateVariance
HasService4 IntIncomeOrExpense
HasService5 IsClosedThisMonth v_Branch
IsEmployee IsClosedToday SourceApplication
LastName IsNewThisMonth Active
LoanAuthorized IsNewToday BranchId
LoanBalance IsOverdrawn BranchName
LoanBalGroup IsSold LeadCompany
MiddleName LastPaymentDate BranchNum
NationalIdentityNum LoanCode Region
NonClassifiedBal LoanDescription SourceBranchId
NonResident LoanToValue Company
NumAccounts MaturityDate
NumNonClassified MonthlyAmortizedSalesNetIncome
NumProdsAndServices MonthlyAvgBal
NumProduct1 MonthlyCosts
NumProduct10 MonthlyFeeIncome
v_OpportunitySuccess v_Opportunity v_Profile
NumProduct2 MonthlyNetIncome
NumProduct3 MonthlySpreadIncome v_Program
AccountId BusinessDate ProfileId
NumProduct4 NextPmtDueDate BusinessDate CustomerId SourceProfileId v_DataDictionary v_MetaData
SourceProgramId
NumProduct5 OriginalLoanAmount CampaignId OpportunityId v_SysSecVisibleRecords v_Time v_date
TableName ObjectType
NumProduct6 OriginalStartDate ChannelId SourceOpportunityId
SystemSecurityId AmPm BSEndOfMonth ColumnName TableName
NumProduct7 OverdrawnAmount CustomerId SystemSecurityId
Hour BusinessDate SourceSystem ColumnName
NumProduct8 PmtFreq OpportunityId
HourAmPm CurrentDate SourceTable DataType
NumProduct9 PriCollCode OpportunitySuccessId v_SDB Minutes CurrentDateName SourceColumn Length_
NumProducts PriCollDesc SourceOpportunitySuccessId v_SystemSecurity MinutesAmPm DateFormat Transformations PrimaryKey
NumProductsGroup ProductCode SystemSecurityId BusinessDate
v_OpportunityProfile Time_ Day_ Alias ForeignKey
NumService3 ProductDesc v_Campaign CustomerId SystemSecurityId
TimeFormat EndOfMonth SCDType Nullable
NumService4 T24ProductGroup SDBBranchId SourceSystemSecurityId
BusinessDate CampaignId TimeOfDay EndOfMonthName SchemaName Identity_
NumService5 PurposeCode SDBId BranchRole
OpportunityId SourceCampaignId TimeOfDayOrder FinancialMonth Computed
NumServices PurposeDesc SourceSDBId AccountBranchRole
ProfileId TimeRange15Min FinancialQuarter precision
Product10Balance ReasonClosed CustSensitivityRole
v_Event FinancialWeek scale
Product1Balance ReviewDate DomainName
FinancialYear ColumnOrder
Product2Balance RiskCode BranchId HasGoodData ComputeColumnDefinition
Product3Balance RiskDescription BusinessDate
Product4Balance ScheduledPmtAmt Month_
CurrencyId v_CampaignChannel MonthName
Product5Balance SoldPoolNum CurrencyId2
Product6Balance SourceSystem BusinessDate MonthNameYear
EventId Quarter
Product7Balance SpreadRate ProductId CampaignId
Product8Balance StandardizedTerm ChannelId ShowData
SourceApplication Weekday
Product9Balance StartDate SourceEventId
Religion StatementDesc WeekdayOrder
SystemSecurityId Year_
Residence Status_ ThirdPartyId
Tenure Term YearMonth
ThirdPartyId2 YearMonthDayName
TenureGroup TermInMonths SourceSystem
TotalBalance TermToMaturity YearMonthName
TotalBalGroup TermToMaturityGroup
SystemSecurityId TermUnit
Occupation TransferRate
MaritalStatus SystemSecurityId
SourceSystemSecurityId SourceProductId
CustAttritionRisk SourceProductId2
CustLoyaltyGroup ProdGroupCode
CustLoyaltyScore ProductStatus
EmailAddress CatAvailDate
Company CatExpiryDate
AnnualBonus ProdLineCode
AnnualBonusGroup ProdLineDesc
CustProfitSegment ProdGroupType
CustSegment ProductType
Partition ProdGroupDesc
Company
LDCategoryProduct
LocalCurrency
v_Address PresCcyMidRevalRate
PresCcyRevalRate
AddressId PresentationBalance
SourceAddressId PresentationCurrency
SourceApplication Partition
AddressLine1
AddressLine2
AddressLine3
AddressType
City
Country
PostalCode
Province

Figure 32 - InsightWarehouse Abstraction views

The abstraction views are what are ultimately exposed to end users of the data warehouse like report
developers and advanced end users with SQL knowledge. No reports should touch the tables directly. The
views hide a lot of the complexity of the dimensional model by joining the dimension and fact tables along
with most of the common role playing dimensions to provide a wide view of a single Insight concept like
Customer or Account.

Additionally, the views contain the joins for the data security layer, so if a user is to have restricted, role-based access to data stored in the warehouse, they must access the data through the views.

Almost all joins in the view layer should be made on the surrogate key (fields that end in Id without Source in front, like AccountId) together with the business date, in order to properly utilize the indexing built into the warehouse. In some cases, such as querying over time, you will want to use the business key instead, to account for changes in a dimension over time. The business keys are the fields that start with Source and end with Id, like SourceAccountId.
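
For example, a point-in-time query joining two abstraction views joins on the surrogate key and the business date (a minimal sketch; it assumes v_Account exposes CustomerId, and the date is illustrative):

select c.CustomerNum, c.FullName, a.AccountNum, a.Balance
from v_Customer c join
v_Account a on
a.CustomerId = c.CustomerId and
a.BusinessDate = c.BusinessDate
where c.BusinessDate = '2018-01-31'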


Technical Components
Tables
DataDictionary
The DataDictionary table is used to store the table and view definitions for InsightWarehouse. The
definitions stored in this configuration table are used by procedures in the data warehouse to automatically
create the warehouse tables and views.

Records are at the InsightWarehouse column granularity. Each source field being brought into
InsightWarehouse may have several records in the data dictionary depending on how many abstraction
layers the column is being displayed in and whether the column has been defined for more than one
configuration.

The data dictionary has a concept of configuration layers. Typically there are at least two if not more
configuration layers present in the data dictionary definitions. The first layer is the Framework layer which
contains a set of source system-independent columns that are required to build the skeletal structure of
the data model. This set consists of system columns such as primary and foreign keys and dimensional
table columns like the Added, Modified and Active columns. These columns are the minimum requirements
to run Analytics ETL.

The next layer consists of a configuration layer labeled as Core which is used to host data from any core
source system. This configuration layer contains all of the core columns that are provided out-of-the-box
with the Advanced Analytics Platform.

The next layer that can be included is for specific columns required for a particular banking system such as Temenos Core Banking Model Bank, e.g. the ModelBank layer.

Finally, there is a client configuration layer that is used to mark columns that are specific to a particular
bank’s implementation of Insight, i.e. the Local layer.

In addition to these basic configuration layers, more can be added if a client installs any optional module
in the Advanced Analytics Platform or in the Core banking source system. Some examples could be:

 PBModelBank: Used for Private Banking record definitions


 CampaignAnalytics: Used for Campaign Analytics solutions
 Predictive: Used for Predictive Analytics solution when deployed
The table below describes the structure and columns of the DataDictionary table.

DataDictionaryid: Record Id (identity column). Populated automatically.
Configuration: Used to identify the configuration layer of the column, as explained above. Available values are:
- ModelBank: this entry has been added to satisfy Temenos core banking mapping and/or business rules
- Local: the entry is used during the implementation to update or enhance Framework or ModelBank functionality
- Framework: this entry has been added by the TFS Framework solution and it is core banking agnostic
- PBModelBank: used for Private Banking record definitions
- CampaignAnalytics: used for Campaign Analytics solutions
- Predictive: used for the Predictive Analytics solution when deployed
ConfigVersion: For future development. Not currently in use.
SchemaName: The schema name for the column. If left NULL the system will use the default dbo schema. The schema to be used to define cube views is Cubes.
ObjectName: Deprecated. No longer in use.
TableName: The name of the InsightWarehouse table that the column will be stored in. If this is a view record, the view that the column will be in.
ColumnName: The name of the column that will be created in the table or view in InsightWarehouse. For view fields, you can enter a name different from the source column and it will be used as an alias.
ColumnType: The data type for the column. This is only required for table columns.
Description_: A business description of the column.
SCDType: The slowly changing dimension type for a dimension table column. The supported types are Type 1 and Type 2. Type 1 will overwrite the old value with the new one and not create a new record in the dimension table. Type 2 will create a new record and mark the old record as inactive. This preserves the old value for older records.
ModuleOrder: Deprecated. No longer in use.
SourceSystem: Used for view records only. If this is set to Direct, calculations are enabled in the view, allowing you to enter a calculation in the Transformations column.
SourceModule: Deprecated. No longer in use.
SourceTable: Used only for view records. Contains the name of the table or view from InsightWarehouse that the view column is pointed to (e.g. abstraction views use tables as sources while cube views use abstraction views as sources). If adding a record to a view from a role playing table then the syntax is FactTable>RolePlayingTableA>RolePlayingTableB etc. See the section on adding a record in the configuration for a detailed example.
SourceColumn: Used only for view records. Contains the column name from the InsightWarehouse table or view the record is pointed to.
Transformations: Used for calculated view columns. Enter the calculation you want to perform in the view for this particular view column. For example: CASE WHEN [v_Customer].CustNewToday = 'Yes' THEN 'New' WHEN [v_Customer].CustClosedToday = 'Yes' THEN 'Closed' ELSE 'Existing' END
Comments: Free format comment section for providing comments about the column.
Enabled_: Used in connection with the Configuration column. Acceptable values are 1 (Yes), 0 (No) or NULL (also No). If a DataDictionary row has the Enabled_ flag set to 1, the core banking table definition in that row will be taken into consideration during the Analytics ETL process; otherwise it will be ignored. Enabled_ is used both to exclude redundant columns from being loaded into InsightWarehouse and to disable obsolete definitions which should not be erased or overwritten.
HasData: Deprecated. No longer in use.
IssueTrackerId: Deprecated. No longer in use.
MappingCompleted: Deprecated. No longer in use.
Verified: Deprecated. No longer in use.
VerifiedBy: Deprecated. No longer in use.
DisplayName: Deprecated. No longer in use.
Added: System field storing the date and time at which the definition was added. Do not manually enter data.
Modified: System field storing the date and time at which the definition was modified. Do not manually enter data.
Deleted: System field storing the date and time at which the definition was deleted. Do not manually enter data.
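
For example, to review which definitions are active for a given warehouse table across configuration layers, the DataDictionary table can be queried directly (a minimal sketch; it assumes the table lives in the dbo schema of InsightWarehouse):

select Configuration, TableName, ColumnName, ColumnType, SCDType, Enabled_
from InsightWarehouse.dbo.DataDictionary
where TableName = 'DimBranch' and Enabled_ = 1
order by Configuration, ColumnName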

SQL Stored Procedures


dbo.s_DDCombinedRecordsUpdate
Description
This stored procedure is used to create the Combined Configuration records in the Data Dictionary. Since
there is the potential for having a column defined in multiple configurations this procedure will pick the
most appropriate data size and type and create the column using that definition to avoid the possibility of
data loss. Type 2 SCD will take precedence over Type 1. This definition is stored under the configuration
called Combined Configuration and is only applicable to table columns.

Steps
1. Delete existing Combined Configuration records from Data Dictionary
2. For all table columns, determine the data type and size that will properly store data fitting
all definitions.
3. Create a new Combined Configuration record
Inputs
No parameters are required for this stored procedure.
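
Example (the procedure takes no parameters):

EXEC dbo.s_DDCombinedRecordsUpdate;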

dbo.s_TableStructureFromDD_update
Description
This procedure uses the table definitions in the Data Dictionary Combined Configuration records to update
the InsightWarehouse tables to meet these definitions. It will not make any changes that will result in data
loss. It attempts to perform table alter statements to add columns or update data types and sizes.

Steps
1. Check if table column is new, if so then add it to the table
2. If the column exists, check for a type mismatch.
3. If type mismatch is discovered attempt to automatically update table column to new type
4. If the auto update fails, an error is returned stating that a manual update may be needed or that the definition is wrong.
Inputs

No parameters are required for this stored procedure.


dbo.s_ViewStructureFromDD_update
Description
This procedure drops all data warehouse views and recreates them based on the view definitions in the
Data Dictionary table. It uses template views as the base for the new view. The template views contain all
of the necessary join logic but contain no columns.

Steps
1. Get list of view templates
2. Get the list of fields for the first template from the Data Dictionary
3. Create the view with all direct columns and calculated columns
4. If the create view statement fails, drop all calculated columns, print the names of those columns, then create the view with just the direct columns.
5. If direct columns are missing from the tables, create the view without the missing columns and print the column names to the output.
Input
No parameters are required for this stored procedure.

dbo.s_DW_DataDelete_Dimension
Description
This is a maintenance procedure used to delete orphaned dimensions from the InsightWarehouse. It is
internally called by s_DW_DataDelete_Range.

dbo.s_DW_DataDelete_Fact
Description
This is a maintenance procedure used to delete fact and bridge table records from the InsightWarehouse.
It is internally called by s_DW_DataDelete_Range.

dbo.s_DW_DataDelete_Range
Description
This is a maintenance procedure used to purge records from the InsightWarehouse. It first purges all fact
and bridge table records with a business date that falls in the date range specified. Then it removes all
orphaned dimension records that are no longer needed after the fact removals.

Steps
1. Determine the first period to be removed from the warehouse excluding month end dates if
the parameter is set as such
2. Exec s_DW_DataDelete_Fact
a. Removes all fact and bridge records in the period

3. Exec s_DW_DataDelete_Dimension
a. Removes all orphaned dimension records
4. Set the Show Data and Has Good Data flags in the Dim Date table to 0 for all removed dates
Inputs
 Delete Start Date – Earliest date in the range of business dates to be removed from
InsightWarehouse. Date format is the native format for the SQL server collation being used.
Example ‘2014-01-01’
 Delete End Date - Latest date in the range of business dates to be removed from
InsightWarehouse. Date format is the native format for the SQL server collation being used.
Example ‘2014-01-31’
 Exclude Month End – Flag to indicate if month end dates should be retained in InsightWarehouse
after the purge. Set to 1 (yes) to retain month-end dates. Defaults to 1.
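
Example (a sketch; the parameter names shown are illustrative, check the procedure definition for the exact signature):

EXEC dbo.s_DW_DataDelete_Range
@DeleteStartDate = '2014-01-01',
@DeleteEndDate = '2014-01-31',
@ExcludeMonthEnd = 1;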

dbo.s_EmptyColumns_drop
Description
This procedure drops all empty non-framework columns from InsightWarehouse. Framework columns are
all columns that have Framework for their Configuration in the Data Dictionary.

Steps
1. Get list of all non-framework columns and check if empty
2. Drop all empty columns
Inputs
This stored procedure does not require any parameters.

dbo.s_BusinessDate_Enable
Description
This stored procedure is executed at the end of the core Analytics ETL to enable the latest imported business date in the InsightWarehouse database. As a result, newly imported data associated with this business date becomes available for reporting in the database engine and in the Analytics Web Front End application. Once this stored procedure has run successfully, the HasGoodData flag is set to 1 (i.e. Yes).

dbo.s_CreateDWHashObjects
Description
This procedure creates the hash calculating columns, indexes and triggers for dimension tables in
InsightWarehouse (but will not affect dimension tables with the same name in InsightStaging). Please note
that the deployment of the CLR function fn_MaxUnicodeSha1HashCLR is a pre-requisite for this stored
procedure to run successfully.


dbo.s_DW_CheckDBIntegrity
Description
This procedure performs a database integrity check in InsightWarehouse.

dbo.s_DW_IndexMaintenance
Description
This procedure rebuilds and reorganizes indexes in InsightWarehouse, applying the following rules:

 Reorganizes indexes if Avg Fragmentation is between 5% and 30%

 Rebuilds indexes if Avg Fragmentation is greater than 30%.

dbo.s_DW_UpdateStatistics
Description
This stored procedure updates statistics on all InsightWarehouse tables.

dbo.s_FilterByDimSystemSecurityUser
Description
This stored procedure allows the content of tables and abstraction views in InsightWarehouse to be filtered
based on the row-level security settings defined for each individual user, if Data Access Security is
configured.

Configuration
Configuring Data Dictionary Table
The Data Dictionary table is used to configure the existing objects in the data warehouse. This includes
adding new columns to tables and views, modifying table columns to change data types or increase the
column size and also creating new calculated columns in views.

Adding New Columns


To add a column to InsightWarehouse you will need to add one or more entries into the Data Dictionary
table depending on where you want the new column to be available. Typically you will want to add the
new column to either the abstraction view layer for exposure to end users or to the cubes through the cube
view layer.

When adding a new table column you will need to consider if the column belongs in a dimension table or
a fact table. This requires you to know how frequently the data in the column will change over time. The
general rule is that any column that changes once a month or more frequently should be added to the fact
table. This will help control dimension growth and keep it at a reasonable level. If you are unsure, it is best to
add the column to the fact table.


If you are adding a new slowly changing column to a dimension table then you need to also decide what
type of slowly changing column type you want to use. The Advanced Analytics Platform supports two slowly
changing dimension column types, Type 1 and Type 2.

When a column is set to Type 1 then when the data in the column changes the old value is overwritten in
the existing record. This means that the old value is discarded and not available in history. If you look up
an old record it will refer to the new value. This type is used for columns that contain information that is
almost always updated due to the initial value being incorrect or there is no analytical value to retaining
the old value. Examples are Customer Birth Date or Customer Tax Identification Number. These should
only ever be updated because the original value was incorrect.

Type 2 slowly changing columns when updated will create a new record in the dimension table. This means
that facts prior to the update of the column will refer to the older value while records added after the
update will refer to the new value. This is done when you want to retain the historical value since it makes
analytical sense to have the older facts refer to the original value. An example would be Customer Address
fields.
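
For illustration, a Type 2 change to a customer address leaves two rows in DimCustomer: the old row is marked inactive and a new row, with a new surrogate key but the same business key, is added (a simplified sketch; the values are invented):

CustomerId   SourceCustomerId   CustAddressLine1   Active
1001         BS:100563          12 Old Street      0
2045         BS:100563          99 New Avenue      1

Facts with business dates before the change join to CustomerId 1001, while newer facts join to CustomerId 2045.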

If you are adding the new column to a role playing dimension like a branch or employee table then you
may want to add additional records to bring this new column directly into one or more abstraction views
such as v_Account or v_Customer to avoid requiring end users to use an extra join to the role playing view
to utilize the new field.

For the column to be populated it must be populated in InsightStaging using logic in a source view or from
InsightETL. Please refer to the InsightStaging section of this guide for the detailed steps to accomplish this.

The following example will cover adding a new dimension table record to the Branch Dimension table. For
a fact table record, you would not add a value for the SCD Type column.

Steps
1. Add a new table record to the Data Dictionary table as shown in the table below (partial example). Add
the record to a client specific configuration so it’s obvious that it is a local development.

Configuration TableName ColumnName ColumnType SCDType SourceSystem


Local DimBranch ProfitCenter Varchar(50) 2 Core
2. Add a record for each view that you need to expose the column in. For our example, we will add records
to add the column to the branch view (the role playing view) and also the customer view in the
abstraction layer, as shown in the table below (partial example).
Adding record for the branch (role playing) view:

Configuration TableName ColumnName SourceSystem SourceTable SourceColumn


Local V_Branch ProfitCenter Direct DimBranch ProfitCenter

Adding a record for the customer view:

When adding a record to include a role playing dimension you will need to reference the join logic from the
view template. Open a view template to see all of the available join references that exist or review existing
role-playing column entries in the Data Dictionary table. All view templates are in the Template schema in
InsightWarehouse. The syntax for view joins is TableA>TableB>TableN etc.


If multiple JOIN statements are made to the same table in the view then subsequent joins will use a
sequential number in parentheses such as TableA>TableB(2)>TableN etc. This references the second join
made to TableB. This is used when you need multiple role playing references such as having multiple
account employees or multiple customer employees. Remember when joining to a dimension you are
joining a fact table to a dimension table so the first table in the source table column should always be a
fact table, as shown in the table below (partial example).

Configuration   Table Name   Column Name        Source System   Source Table             Source Column

Local           V_Customer   CustProfitCenter   Direct          FactCustomer>DimBranch   ProfitCenter

3. Exec s_DDCombinedRecordsUpdate – creates a new Combined Configuration record for the column


4. Exec s_TableStructureFromDD_update – creates new table column in DimBranch based on Combined
Configuration record.
5. Exec s_ViewStructureFromDD_update – creates new view records based on Data Dictionary definitions.
If the column does not exist in the warehouse table this will be referenced in the output and view
column will not be created.
6. Run ETL to populate column.

Modifying an Existing Table Column


Sometimes it’s necessary to modify an existing table column in InsightWarehouse. In most cases, this is
done to increase the size of the column but there are also rare cases when the data type needs to be
changed and this can be accommodated under certain circumstances. If it is possible to change a data type
without data loss or with a simple alteration, then this will be done automatically. Examples include going
from Int to BigInt or SmallDateTime to DateTime.

In the partial example, shown in the table below, we would already have a record in the data dictionary
for this column and we would be adding an additional table record for the column under a new local
configuration.

Existing Record:

Configuration TableName ColumnName ColumnType SCDType SourceSystem


Core DimBranch ProfitCenter Varchar(25) 2 Core

Steps
1. Add new record to the Data Dictionary table

Configuration TableName ColumnName ColumnType SCDType SourceSystem


Local DimBranch ProfitCenter Varchar(50) 2 Core

2. Exec s_DDCombinedRecordsUpdate – This will create a new Combined Configuration record for the
table column. It compares all column definitions in all configurations to find the best type and size. In
this case, we’ve added a record with a larger size so it will create a new Combined Configuration record
with the larger size varchar(50) as the data type.
3. Exec s_TableStructureFromDD_update – This will attempt to actually modify the column with a table
alter statement based on the new Combined Configuration definition from the Data Dictionary table. It
will report success or failure in the output message of the stored procedure.

Creating a Calculated View Column


Calculated columns can be added to an abstraction view just by creating a new entry in the Data Dictionary
table. This may be beneficial if you need to create a calculation over time that is not possible to perform in
ETL.

The calculation must be based on a field that exists in the data warehouse either in a table or a view. Also,
the column you are using in the calculation must be included in one of the existing tables in the join logic
for the view template the view is based on.

For example, if I want to create a calculated column in the v_GL abstraction view I would need to view the
template join logic for v_GL. This is located in the InsightWarehouse.Template.v_GL view. Opening this
view will show me the join logic and what tables are available to use. The join logic for this view is shown
below.
from
DimGL left join
FactGL on
DimGL.GLId = FactGL.GLId left join
DimCustomer [factgl>DimCustomer] on
[factgl>DimCustomer].CustomerId = FactGL.CustomerId LEFT JOIN
DimBranch [factgl>DimBranch] on
FactGL.BranchId = [factgl>DimBranch].BranchId LEFT JOIN
DimBranch [factgl>DimBranch(2)] on
FactGL.BranchId2 = [factgl>DimBranch(2)].BranchId LEFT JOIN
DimEmployee [factgl>DimEmployee] on
FactGL.EmployeeId = [factgl>DimEmployee].EmployeeId LEFT JOIN
DimAccount [factgl>DimAccount] on
FactGL.AccountId = [factgl>DimAccount].AccountId
Here we can see the database references we will use for our calculation. We can reference DimGl or FactGL
directly but if we want to create a calculation using one of the fields from the second join to DimBranch,
for example, we would need to use [FactGl>DimBranch(2)].Columnname. For the majority of
calculated columns, you will be using the fact or dimension table from the object (i.e. Account or Customer)
and you can just reference this directly without needing to open the view template.

For the example, we will create a new column in the Customer abstraction view called CustIsRich which divides customers into Rich and Not Rich based on their deposit balances. While this is better handled in InsightETL, it will serve as a simple example here.

When creating the calculation, write it exactly as it would appear in the select statement for the view. Do not include commas, Select or From keywords, or column aliases. The alias for the calculated column is the value in the Column Name column.

Steps


1. Determine which fields are required for the calculation. For our example, we will use
FactCustomer.DepsBalance. Since this is in the fact table that v_Customer uses, we can reference the
column directly.
2. Create a new view record in the Data Dictionary

Configuration   Table Name   Column Name   Source System   Transformation

Local           V_Customer   CustIsRich    Calculated      Case When DepsBalance > 500000 Then 'Rich' Else 'Not Rich' End
3. Exec s_ViewStructureFromDD_update

SQL Stored Procedures


A number of stored procedures are executed in the InsightWarehouse database both as part of the Analytics
ETL, i.e. the SQL Agent Job used to Extract-Transform-Load data within the Advanced Analytics Platform,
and to carry out maintenance tasks within this database.

Analytics ETL Procedures


Two stored procedures are executed in the InsightWarehouse database, within two separate steps of Analytics ETL.

Update Stats on SW
sp_updatestats is a standard T-SQL stored procedure that runs UPDATE STATISTICS against all user-
defined and internal tables in InsightWarehouse. This stored procedure is executed within the Update Stats
on SW step of Analytics ETL.

Enabling Business Dates


A procedure is executed to enable the latest business date as part of the Analytics ETL job as follows:

declare @date as smalldatetime = (select MAX(BusinessDate) from InsightStaging.dbo.sourceDate );

EXEC dbo.s_BusinessDate_Enable @Date=@date, @Reset=null,


@BatchNum=null

Maintenance Procedures
The stored procedure InsightETL.dbo.s_ColumnStoreIndex_Defragmentation is executed right after data has
been loaded into InsightWarehouse Dimension and Fact columnstore index tables. It is executed as follows:

EXEC [dbo].[s_ColumnStoreIndex_Defragmentation]

@DatabaseName = N'InsightWarehouse',

@CompressRowGroupsWhenGT = 5000,

@MergeRowGroupsWhenLT = 150000,

@DeletedTotalRowPercentage = 10,


@DeletedSegmentsRowPercentage = 20,

@EmptySegmentsAllowed = 0,

@ExecOrPrint = N'EXEC',

@BatchNum = null

Maintenance Agent Job


A maintenance agent job, "InsightWarehouse CSI Defragmentation", is provided that calls the stored procedure InsightETL.dbo.s_ColumnStoreIndex_Defragmentation. This job is provided in case row groups need to be compressed or merged further than the daily maintenance task run in the Analytics ETL allows. To achieve the ideal compressed rowgroup size, different values have to be passed to the stored procedure. Keep in mind that reorganizing a large columnstore index will initially require additional CPU resources to compress the data, which could slow overall system performance. Thus, schedule the job to run when the impact on users will be minimal.
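
For example, to merge row groups more aggressively during a quiet window, the job could pass larger thresholds (the values below are purely illustrative):

EXEC [InsightETL].[dbo].[s_ColumnStoreIndex_Defragmentation]
@DatabaseName = N'InsightWarehouse',
@CompressRowGroupsWhenGT = 1000,
@MergeRowGroupsWhenLT = 500000,
@DeletedTotalRowPercentage = 10,
@DeletedSegmentsRowPercentage = 20,
@EmptySegmentsAllowed = 0,
@ExecOrPrint = N'EXEC',
@BatchNum = null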


InsightSystem
Overview
InsightSystem is exclusively used to store configuration tables for the Analytics Web Front End Application.
These configuration tables should be directly updated through the web application – please refer to the
Analytics Application Functional User Guide for more details about this. Furthermore, this database is not
involved in the ETL processing or in any other data processing or maintenance operation.

Specific Features / Functions


Feature: Storing Analytics Web Front End configuration
Description: InsightSystem stores configuration tables and views for the Analytics web front end application. Configuration data in this database will be accessed directly from the web application.


Budget Data
Overview
The Advanced Analytics Platform has a significant amount of functionality and content for budget reporting, but it is important to highlight that Analytics is not a budgeting or forecasting tool. It has the flexibility to
handle the import and reporting of multiple budgets.

Budget data is imported into the Fact GL table, where each budget is stored as separate rows rather than separate columns. This is how Analytics supports multiple budgets without the need to create new columns or new cube attributes. All amounts are stored in the GLAmount column in Fact GL.

You can differentiate budget data from banking system data by the GLSourceSystem column. This will have a value of BS for all banking system values and Budget (or something similar) for all budget values. The GL abstraction view has logic to create the GLActualAmount and GLBudgetAmount columns, splitting the actual and budget amounts column-wise. These columns are used in the out-of-the-box budget dashboards and budget reports.
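
For example, actual and budget amounts can be compared side by side directly from v_GL (a minimal sketch; GLNum and GLDescription are v_GL columns shown in Figure 32, and the business date is illustrative):

select GLNum, GLDescription,
sum(GLActualAmount) as ActualYTD,
sum(GLBudgetAmount) as BudgetYTD
from v_GL
where BusinessDate = '2014-01-31'
group by GLNum, GLDescription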

The standard process for importing budgets into Analytics is to enter budget data into an Excel spreadsheet
and use the data import functionality of SQL Server to add this data to a table in the Budget database.
Budget data is treated as a separate source system in Analytics.

Budget Data Granularity and Structure


Budget data is stored in Analytics at the general ledger level of granularity and all budgets should be
cumulative (year to date) for ideal use in Analytics out of the box budget reporting. Analytics does not
require that budgets be provided at the most detailed level but supports multiple levels of granularity.
Typically for the most flexibility and to accommodate chart of account changes it is recommended to create
budget data at the GL line level of granularity. If budgets are any less granular, significant changes to the chart of accounts during the budget cycle may negatively affect how the data is displayed.

Budgets can be more granular than the GL Line level, for instance, a bank can budget for each GL line per
branch. Essentially any additional attributes already defined in the General Ledger view in
InsightWarehouse can be used to slice the budget data and support more granular budgets. Keep in mind
that even though this is possible, entering and maintaining budgets at a very fine level of granularity can
become quite onerous.

Even if not budgeting by branch it is required to assign all budget amounts to at least one default branch,
typically the admin branch. The minimum fields that are required for budget data are shown in a standard
budget table below. There would be an amount column for each month but in the example, we only show
from Jan to Mar.

GL Branch Mnemonic   GL Line Num   GL Line Desc      Jan      Feb      Mar
BNK                  1234          Cash From Banks   120000   240000   380000
CO1                  1234          Cash From Banks   300000   625000   950000


Importing Budget Data into InsightLanding


Once the budget data has been properly entered into an Excel spreadsheet then it is ready for import. If
this is the first time importing budget data then you may need to create the Budget database. This is just
a standard database with no tables initially. If you already have a Budget database then you can skip the
first step.

1. Create Budget Database


Used for keeping budget data. Each budget should be imported as a separate table following the basic
structure defined in the data granularity and structure section.

The Budget database should have one table added manually titled SourceDate. This should have one
column titled BudgetBusinessDate. Add one record with the current budget date being imported, for
example, 2014-01-01 for the 2014 budget.

Budget Business Date


2014-01-01

2. Use SQL Server Import and Export Wizard to import data into Budget database
2.1. Set data source to Excel spreadsheet and point it to the Excel sheet picking the latest version of
Excel in the list if the installed version is not present. Indicate if the first row has column names.
2.2. Choose the Budget database as the destination
2.3. Choose the copy data from one or more tables or views option
2.4. Select the sheet that has the budget data. Rename the destination table as GLBudget or something
else descriptive. Click the preview button to make sure the data looks correct.
2.5. Select run immediately and Finish.
2.6. The package should run reporting success and the number of rows imported. Check to see if the
row count is correct. If package reports errors check permissions to Budget database.
3. Create new Extract List Record

Create a record in the InsightLanding Extract List table for the new Budget table. This should have its own
source name like Budget to indicate that this is data coming from a system other than the core banking
system.

Source Name   Source DB   Source Schema   Source Table   Import Flag   Import Fields   Import Order   Where Clause   User Id
Budget        Budget      Dbo             GLBudget       1             *               1              *              dbo

4. Create new Extract Source Date record

Create a record in the InsightLanding Extract Source Date table for the budget data. This record needs to
include a query that returns the source date for the budget data. For our example we will use the query:
Select @bsdate = max(BudgetBusinessDate) from Budget.dbo.SourceDate.

Source Name   BS Date SQL
Budget        Select @bsdate = max(BudgetBusinessDate) from Budget.dbo.SourceDate


5. Exec s_InsightLanding_Update 'Budget', '2014-01-01', 'dbo', 'No'
Run the Landing update procedure to bring the budget data into Landing. If everything was configured
correctly for our example we would see two new tables 20140101Budget.GLBudget and
20140101Budget.SourceDate in InsightLanding.

6. Exec s_InsightSource_Update 'Budget', '2014-01-01', 'No'
Run this procedure in InsightSource to bring in the budget data so it is ready to be used in InsightStaging
for the next step.

Budget Source Views


Analytics comes with standard template source views for budgets: one for the budget data itself and the other for the budget date. The budget data source view transposes the budget data so that budget amounts appear in rows instead of columns. This source view assumes that GLSourceSystem will be set to 'Budget' and that the data will be at the GL Line level of granularity. You will need to add the branch and any additional GL fields you are budgeting by to the select statement. Below is a source view created from our example, which includes branch and imports budgets daily. Use the commented-out BusinessDate expression instead to bring in budgets just for month end. You should not need to edit the budget date source view.

1. Modify Source Views


GL Budget Data Source View
Alter view [dbo].[v_sourceGLBudget] as

select
    cast(CAST(HASHBYTES('SHA1', cast(d.BusinessDate as varchar) + ':' + b.Line_No) as
        bigint) as varchar(50)) as sourceGLId,
    'Budget' as GLSourceSystem,
    'BS:' + b.GLBranchMnemonic as SourceBranchId,
    b.Line_No as GLNum,
    cast(b.GLAmount as money) as GLAmount,
    --dateadd(day, -1, dateadd(month, b.MonthNumber, d.BudgetBusinessDate)) as BusinessDate
    d.BusinessDate as BusinessDate
from (
    -- unpivot the monthly amount columns into one row per GL line, branch and month
    select Line_No,
        GLBranchMnemonic,
        case Month_
            when 'Jan' then 1
            when 'Feb' then 2
            when 'Mar' then 3
            when 'Apr' then 4
            when 'May' then 5
            when 'June' then 6
            when 'July' then 7
            when 'Aug' then 8
            when 'Sept' then 9
            when 'Oct' then 10
            when 'Nov' then 11
            when 'Dec' then 12
        end as MonthNumber,
        GLAmount
    from (
        select GlLineNum as Line_No, Jan, Feb, Mar, Apr, May, June, July, Aug, Sept,
            Oct, Nov, [Dec], GLBranchMnemonic
        from [$(InsightSource)].Budget.GLBudget) p
    unpivot
    (
        GLAmount for Month_ in (Jan, Feb, Mar, Apr, May, June, July, Aug, Sept, Oct,
            Nov, [Dec])
    ) as unpvt
) b join
v_sourceDateBudget d on
    b.MonthNumber = datepart(month, d.BusinessDate)

Budget Date Source View


Alter view [dbo].[v_sourceDateBudget] as

select
1 as sourceDateID, -- Do not use. for extract step only
max(sd.BusinessDate) as BusinessDate ,
MAX (bd.BudgetBusinessDate) as BudgetBusinessDate
from
[$(InsightMasterData)]..CurrentDate cd join
[$(InsightMasterData)].dbo.SourceDate sd on
cd.BusinessDate = sd.BusinessDate and
sd.SourceSystem = 'Budget' left join
[$(InsightSource)].Budget.sourceDate bd on
bd.BudgetBusinessDate = sd.SourceDate

2. Test Source Views

Select data from the source views and ensure that the correct number of rows are being returned.
Remember that the budget data has been unpivoted and, since the view joins the budget date view, it will return only one month of data. There must be a record in InsightMasterData..SourceDate for the Budget source system for the business date being processed (the date in InsightMasterData..CurrentDate) for the budget source view to return data.

3. Check Update Order and Systems Tables

Ensure that the InsightStaging Update Order table has records for the budget source system. If not you
will have to create two records for the extract and load steps for DimGL.

Update Order   Configuration Name   Table   Source System   Source Table   Action Type       Enabled   Excluded
10820101       ModelBank            DimGL   Budget          Budget         Extract.substep   1         0
10820104       ModelBank            DimGL   Budget          Budget         Add.substep       1         0

Check the Systems table and make sure there is a record for the Budget system and it is set to enabled.
4. Exec s_InsightStaging_Update 1, 1

Run the InsightStaging update procedure for a business date that should have budget data. This will load
the budget data into InsightWarehouse for final testing.


Budget Data in InsightWarehouse


Budget data in InsightWarehouse is stored in the Fact GL table and ultimately exposed in the GL abstraction view (v_GL). To confirm that budget data has been successfully imported, query v_GL for the distinct GL source systems.

Select Distinct(GLSourceSystem) from v_GL where BusinessDate = {Business date processed in previous
step}

You should now see a source system that is defined in the budget source view and for the default
configuration that would be Budget. Finally, you can select all records from v_GL where the GL Source
System is equal to Budget to confirm that all budget records have been imported successfully.
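
Example (the business date is illustrative):

select * from v_GL
where GLSourceSystem = 'Budget' and
BusinessDate = '2014-01-31'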

The budget amount should be in the GLAmount column in v_GL and also in the GLBudgetAmount column.


Analytics ETL Processing


Overview
Analytics ETL is processed by means of SQL stored procedures called by SQL Agent jobs. The SQL Agent job can be restarted from any failed step if needed. Alternatively, Analytics ETL can be executed by calling the SQL stored procedures directly with the same parameters as in the SQL Agent Job. Because of this, the
ETL process can be customized for different scenarios as needed, such as calling additional custom source
systems as part of the job.
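
For example, an additional custom source can be processed by calling the InsightSource update procedure a second time with that source name, mirroring the standard 'BS' step (a sketch; it assumes a source named 'Budget' has been configured as described in the Budget Data chapter):

declare @CurrentETLDate date = (select max(BusinessDate) from InsightETL.dbo.CurrentDate);

EXEC [InsightSource].dbo.s_InsightSource_CSI_Update @sources = 'Budget', @BSDate = @CurrentETLDate, @BatchNum = null, @TotalThreads = null;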

Technical Details
Technical Components
InsightETL
dbo.s_MergeUpdateBatchStatus
This stored procedure, executed at the beginning of Analytics ETL, creates a new batch in the Batch table
and will perform in-place updates to the involved Batch records.
dbo.s_CreateColumnCalculations
This stored procedure reads the content of the AttributeCalculations table in InsightETL and, based on the
Split, Calculations and Datasets definitions, it creates new columns in the target tables. In Analytics ETL,
this stored procedure is directly executed twice, first on the tables imported in the InsightImport database
and then on the tables imported into InsightSource. Furthermore, this stored procedure is internally called
by the s_InsightStaging_Update stored procedure when Analytics ETL reaches its core steps in the
InsightStaging database.
dbo.s_CreateRuleGroup
This stored procedure applies business rules designed through the Rules Engine functionality of InsightETL on a specific database whose name is specified as an input parameter. In Analytics ETL, this stored procedure is directly executed three times: first on the tables imported into the InsightImport database, then on the tables loaded into InsightLanding, and then on the tables imported into InsightSource. Furthermore, this stored procedure is internally called by the s_InsightStaging_Update stored procedure when Analytics ETL reaches its core steps in the InsightStaging database.
dbo.s_PopulateAuditCounts
This stored procedure counts the records processed within each database involved in ETL processing. It is executed several times during ETL for record count reconciliation between source and target databases. The results of these calculations are stored in the dbo.TableRowCountAudit log table in InsightETL.
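
After a run, the reconciliation results can be inspected directly from the log table (the table name is documented above; its exact column list is release-dependent, so the query selects all columns):

select * from InsightETL.dbo.TableRowCountAudit;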
dbo.s_ColumnStoreIndex_Defragmentation
This procedure forces all of the rowgroups into the columnstore and then combines the rowgroups into fewer rowgroups with more rows. The ALTER INDEX REORGANIZE online operation also removes rows that have been marked as deleted from the columnstore index tables. During Analytics ETL, this stored procedure is executed twice for maintenance reasons, first within the InsightLanding database and then in InsightWarehouse.

Example

EXEC [dbo].[s_ColumnStoreIndex_Defragmentation]

@DatabaseName = N'InsightLanding',

@CompressRowGroupsWhenGT = 5000,

@MergeRowGroupsWhenLT = 150000,

@DeletedTotalRowPercentage = 10,

@DeletedSegmentsRowPercentage = 20,

@EmptySegmentsAllowed = 0,

@ExecOrPrint = N'EXEC',

@BatchNum = null

dbo.s_SetBatchStatusFinished
This stored procedure marks the completion of any batch executed, either 'CompletedWithError' if there are errors or 'CompletedSuccessfully' when no error is encountered. It is executed at the end of the Analytics ETL agent job.
Furthermore, InsightETL is used to set the current date through a SQL statement similar to the one below.

Example

declare @CurrentETLDate date = (select max(mis_date) from InsightImport.dbo.DATES);

UPDATE CurrentDate set BusinessDate = @CurrentETLDate;

InsightImport
Insight.s_ImportBaseTables
This procedure manages and controls the process of bulk inserting CSV files into SQL Server in parallel, together with DQI (Data Quality Import) and/or DPI (Data Profiler Import).

Insight.s_ImportSubTables
This procedure parses multi-values, sub-values, and local reference fields creating the associated ‘sub’
tables.

Insight.s_T24ConsolKeys_add
This procedure parses Consolidation keys for GL-related tables like RE_CONSOL_SPEC_ENTRY and any CRF
file.

Steps
1) Updates Batch log
EXEC [InsightETL].dbo.s_MergeUpdateBatchStatus null, null, null
2) Runs the Import.
Exec [Insight].[s_Import_Control] @PathName = 'E:\InsightImport', @TableName = 'All',
@ReCreateTables = 0, @TableType = 'Regular', @SystemTablesExist = 0, @BatchNum = null,
@TotalThreads = null;
3) Checks imported tables against HASH_TOTAL
Exec [InsightImport].[Insight].[s_ImportDataReportErrors] @TableName = 'All', @BatchNum =
null;
4) Performs Local reference fields parsing
Exec [Insight].[s_Import_Control] @PathName = 'E:\InsightImport', @TableName = 'All',


@ReCreateTables = 0, @TableType = 'localref', @SystemTablesExist = 1, @BatchNum = null,
@TotalThreads = null;
5) Performs Multi-value parsing
Exec [Insight].[s_Import_Control] @PathName = 'E:\InsightImport', @TableName = 'All',
@ReCreateTables = 0, @TableType = 'mv', @SystemTablesExist = 1, @BatchNum = null,
@TotalThreads = null;
6) Performs Multi-value sub-value parsing
Exec [Insight].[s_Import_Control] @PathName = 'E:\InsightImport', @TableName = 'All',
@ReCreateTables = 0, @TableType = 'mvsv', @SystemTablesExist = 1, @BatchNum = null,
@TotalThreads = null;
7) Performs sub-value parsing for local reference fields
Exec [Insight].[s_Import_Control] @PathName = 'E:\InsightImport', @TableName = 'All',
@ReCreateTables = 0, @TableType = 'lrsv', @SystemTablesExist = 1, @BatchNum = null,
@TotalThreads = null;
8) Parses consolidation keys for GL
exec [Insight].[s_T24ConsolKeys_Add] @TableName = 'All', @BatchNum = null;

Insight.s_BuildCOA
This procedure builds the Chart of Accounts table.

InsightLanding
dbo.s_InsightLandingTable_CSI_Table_Update
Loads data into InsightLanding’s columnstore index tables.

Example

declare @CurrentETLDate date = (select max(BusinessDate) from InsightETL.dbo.CurrentDate);

EXEC dbo.s_InsightLanding_CSI_Table_Update @SourceName = 'BS', @BSDate = @CurrentETLDate,


@UserId = 'dbo', @CreateRowBaseIndex = 0, @DataCompression = 'COLUMNSTORE',
@UpdateTableSchema = 1, @BatchNum = null, @TotalThreads = null;

InsightSource
dbo.s_InsightSource_CSI_Update
Loads data from InsightLanding columnstore index tables into the InsightSource database.

Example

declare @CurrentETLDate date = (select max (BusinessDate) from InsightETL.dbo.CurrentDate);


EXEC dbo.s_InsightSource_CSI_Update @sources = 'BS', @BSDate = @CurrentETLDate, @BatchNum =


null, @TotalThreads = null;

InsightStaging
dbo.s_InsightStaging_Update (Extract)
Extracts data into Staging Tables

Example

EXEC dbo.s_InsightStaging_Update @ExecuteExtractSteps = 1, @ExecuteTransformLoadSteps = 0,


@BatchNum = null, @TotalThreads = null;

dbo.s_InsightStaging_Update (Transform)
Transforms data into Dim and Facts and Loads InsightWarehouse

Example

EXEC dbo.s_InsightStaging_Update @ExecuteExtractSteps = 0, @ExecuteTransformLoadSteps = 1,


@BatchNum = null, @TotalThreads = null;

InsightWarehouse
sp_updatestats
Updates Analytics ETL statistics.

dbo.s_BusinessDate_Enable
Enables the business date.

Example

Declare @date as smalldatetime = (select MAX(BusinessDate) from InsightStaging.dbo.sourceDate );

EXEC dbo.s_BusinessDate_Enable @Date=@date, @Reset=null,


@BatchNum=null

SQL Agent Jobs


Analytics ETL Job
The Analytics ETL job runs the above-described stored procedures.

Analytics ETL consists of the following steps.

Step Description (SQL Run)


Create a new Batch ID for the current ETL run    EXEC [InsightETL].dbo.s_MergeUpdateBatchStatus null, null, null
Insight Base Tables Exec [InsightImport].[Insight].[s_ImportBaseTables]
@CsvDir = 'E:\InsightImport', @TableName = 'All',
@SystemTablesExist = 0, @BatchNum = null,
@TotalThreads = null;
Local Ref Parsing Exec [InsightImport].[Insight].[s_ImportSubTables]
@TableName = 'All', @TableType = 'LocalRef', @BatchNum
= null, @TotalThreads = null;


Multi-value Parsing Exec [InsightImport].[Insight].[s_ImportSubTables]


@TableName = 'All', @TableType = 'MV', @BatchNum =
null, @TotalThreads = null;
Multi-value Sub-value Parsing Exec [InsightImport].[Insight].[s_ImportSubTables]
@TableName = 'All', @TableType = 'MVSV', @BatchNum =
null, @TotalThreads = null;
Local Ref Sub-value Parsing Exec [InsightImport].[Insight].[s_ImportSubTables]
@TableName = 'All', @TableType = 'LRSV', @BatchNum =
null, @TotalThreads = null;
Parsing Consol Keys Exec [InsightImport].[Insight].[s_T24ConsolKeys_Add]
@TableName = 'All', @BatchNum = null;
Build COA exec [InsightImport].[Insight].[s_BuildCOA]
Insight Attributes Calculations-Import exec s_CreateColumnCalculations
'InsightImport','All','dbo','All',1
exec s_CreateColumnCalculations
'InsightImport','All','dbo','All',2
exec s_CreateIndexes 'All','InsightImport'
Create InsightImport Rules declare @CurrentETLDate date =
(select max(mis_date) from InsightImport.dbo.DATES);
--InsightImport Rules
EXEC [InsightETL].[dbo].[s_CreateRuleGroup]
@DatabaseName = N'InsightImport',
@TableName = N'all',
@SchemaName = N'all', --not hooked up --
will be all
@ExecutionPhase = N'all',
@ExecutionStep = 0, --all steps
@BusinessDate = @CurrentETLDate,
@IsPersisted = 2,--all
@RuleDefinitionId = null,
@BatchNum = NULL,
@StagingEventLogId = null;
----------------------
Setup Current ETL Data [USE InsightETL]
declare @CurrentETLDate date = (select max(mis_date)
from InsightImport.dbo.DATES);
UPDATE CurrentDate set BusinessDate = @CurrentETLDate;
InsightLanding CSI BS Update declare @CurrentETLDate date = (select max(BusinessDate)
from InsightETL.dbo.CurrentDate);

EXEC
[InsightLanding].dbo.s_InsightLanding_CSI_Table_Update
@SourceName = 'BS', @BSDate = @CurrentETLDate,
@UserId = 'dbo', @CreateRowBaseIndex = 0,
@DataCompression = 'COLUMNSTORE',


@UpdateTableSchema = 1, @BatchNum = null,


@TotalThreads = null;
InsightLanding CSI Defragmentation EXEC
[InsightETL].[dbo].[s_ColumnStoreIndex_Defragmentation]
@DatabaseName = N'InsightLanding',
@CompressRowGroupsWhenGT = 5000,
@MergeRowGroupsWhenLT = 150000,
@DeletedTotalRowPercentage = 10,
@DeletedSegmentsRowPercentage = 20,
@EmptySegmentsAllowed = 0,
@ExecOrPrint = N'EXEC',
@BatchNum = null

Create InsightLanding Rules declare @CurrentETLDate date =


(select max(mis_date) from InsightImport.dbo.DATES);
--InsightLanding Rules
EXEC [InsightETL].[dbo].[s_CreateRuleGroup]
@DatabaseName = N'InsightLanding',
@TableName = N'all',
@SchemaName = N'all', --not hooked up --
will be all
@ExecutionPhase = N'all',
@ExecutionStep = 0, --all steps
@BusinessDate = @CurrentETLDate,
@IsPersisted = 2,--all
@RuleDefinitionId = null,
@BatchNum = NULL,
@StagingEventLogId = null;
----------------------
Insight Source Update Declare @CurrentETLDate date = (select max
(BusinessDate) from InsightETL.dbo.CurrentDate);

EXEC [InsightSource].dbo.s_InsightSource_CSI_Update
@sources = 'BS', @BSDate = @CurrentETLDate,
@BatchNum = null, @TotalThreads = null;
Insight Attributes Calculations-Source [USE InsightETL]
exec s_CreateColumnCalculations
'InsightImport','All','dbo','All',1
exec s_CreateColumnCalculations
'InsightImport','All','dbo','All',2
exec s_CreateIndexes 'All','InsightSource'
Create InsightSource Rules declare @CurrentETLDate date =

Page 222 | 335


Advanced Analytics Platform Technical Guide

(select max(mis_date) from InsightImport.dbo.DATES);


--InsightImport Rules
EXEC [InsightETL.[ [dbo].[s_CreateRuleGroup]
@DatabaseName = N'InsightSource',
@TableName = N'all',
@SchemaName = N'all', --not hooked up --
will be all
@ExecutionPhase = N'all',
@ExecutionStep = 0, --all steps
@BusinessDate = @CurrentETLDate,
@IsPersisted = 2,--all
@RuleDefinitionId = null,
@BatchNum = NULL,
@StagingEventLogId = null;
----------------------
InsightStaging Update – Extract EXEC [InsightStaging].dbo.s_InsightStaging_Update
@ExecuteExtractSteps = 1, @ExecuteTransformLoadSteps
= 0, @BatchNum = null, @TotalThreads = null;
InsightStaging Update – Transform/ Load EXEC [InsightStaging].dbo.s_InsightStaging_Update
@ExecuteExtractSteps = 0, @ExecuteTransformLoadSteps
= 1, @BatchNum = null, @TotalThreads = null;
Update Stats on SW EXEC sp_updatestats
Enabling Business Date [USE InsightWarehouse]
declare @date as smalldatetime = (select
MAX(BusinessDate) from InsightStaging.dbo.sourceDate );
EXEC
dbo.s_BusinessDate_Enable @Date=@date, @Reset=null,
@BatchNum=null
Set Completion of the Current ETL Batch EXEC InsightETL.dbo.s_SetBatchStatusFinished;
InsightWarehuse CSI Defragmentation EXEC
[InsightETL].[dbo].[s_ColumnStoreIndex_Defragmentation]
@DatabaseName = N'InsightWarehouse',
@CompressRowGroupsWhenGT = 5000,
@MergeRowGroupsWhenLT = 150000,
@DeletedTotalRowPercentage = 10,
@DeletedSegmentsRowPercentage = 20,
@EmptySegmentsAllowed = 0,
@ExecOrPrint = N'EXEC',
@BatchNum = null

Process InsightWarehouseOLAP Database See Process Analytics Cubes chapter.


(only available if at least one Analytics
Content Pack is deployed)
Page 223 | 335
Advanced Analytics Platform Technical Guide

Refresh KPI Cache Declare @KPICacheRefresh Nvarchar(50)

SELECT @KPICacheRefresh = [Value]


FROM [InsightWarehouse].[dbo].[DimSystemParameters]
Where [Type] = 'Application' and [Name] = 'KPI Cache
Refresh' and Active = 1

IF @KPICacheRefresh = 'Enabled'
EXEC msdb.dbo.sp_start_job @job_name = 'KPI Cache
Maintenance'
In case the Current ETL Batch ended with EXEC InsightETL.dbo.s_SetBatchStatusFinished;
Error, Report and Exit

Process Analytics Cubes


If at least one Analytics Content Pack is deployed within the Advanced Analytics Platform, the Analytics ETL
agent job contains a step that processes all Multidimensional Cubes using the T-SQL logic shown below.

Example

Declare @BusinessDate date,
        @CommandText Nvarchar(200),
        @CubeProcessType Nvarchar(50)

SELECT @CubeProcessType = case when [Value] is null or [Value] = '' Then 'NoProcess'
                               Else [Value] End
FROM [InsightWarehouse].[dbo].[DimSystemParameters]
Where [Type] = 'Cube' and [Name] = 'ProcessType' and Active = 1

SELECT case when @CubeProcessType is null or @CubeProcessType = '' Then 'NoProcess'
            Else [Value] End
FROM [InsightWarehouse].[dbo].[DimSystemParameters]
Where [Type] = 'Cube' and [Name] = 'ProcessType' and Active = 1

--Print @CubeProcessType
--Print @BusinessDate

--Testing
--Set @CubeProcessType = 'ProcessFull'
--Set @CubeProcessType = 'NoProcess'

----------------------------------------
If @CubeProcessType = 'ProcessByPartition'
BEGIN
    Print 'Process by Partition'
    EXEC msdb..sp_start_job @job_name = 'Process Insight Cubes - By Partition',
         @step_name = 'Update Process Date'
END
-----------------------------------
If @CubeProcessType = 'ProcessFull'
BEGIN
    Print 'Process full'
    EXEC msdb..sp_start_job @job_name = 'Process Insight Cubes - By Partition',
         @step_name = 'Full Process'
END
-----------------------------------
If @CubeProcessType = 'NoProcess'
BEGIN
    Print 'No Process'
END

Scheduling Jobs
Agent jobs can be scheduled.

Double-click on the Agent Job and select Schedules.
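Equivalently, a schedule can be attached in T-SQL. The sketch below is illustrative only: the job name, schedule name and start time are assumptions to be adapted to the installation.

-- Illustrative only: attach a daily 01:00 schedule to the ETL job (names/times assumed).
USE msdb;
GO
EXEC dbo.sp_add_jobschedule
    @job_name = N'Analytics ETL',      -- assumed job name
    @name = N'Daily Analytics ETL',    -- assumed schedule name
    @freq_type = 4,                    -- daily
    @freq_interval = 1,                -- every day
    @active_start_time = 010000;       -- 01:00:00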

Log File Viewer


The log file viewer can be used to view the history of job runs.

In SQL Server Management Studio, right-click on Jobs under SQL Server Agent and select “View History”
from the context menu.

Figure 33 - Log File Viewer content sample


Process Insight Cubes – By Partition


Overview
This SQL agent job processes the OLAP Multidimensional Cubes by partition when at least one Analytics
Content Package is deployed within the Advanced Analytics Platform: only the latest partition is created and
processed, while older partitions are left untouched. There are two prerequisites for this job to run
successfully: first, the IM Framework SSAS Management project must be published; second,
s_BusinessDate_Enable must have completed successfully for the business date being processed.


Process Data ExStore


Overview
This SQL agent job runs a partial Analytics ETL: Process Data ExStore imports the latest Temenos Core
Banking data as far as InsightLanding.

SQL Agent Jobs


Process Data ExStore Job
The Process Data ExStore job consists of the following steps.

Step Description (SQL Run)

Create a new Batch ID for the current ETL run
    EXEC [InsightETL].dbo.s_MergeUpdateBatchStatus null, null, null

Insight Base Tables
    EXEC [InsightImport].[Insight].[s_ImportBaseTables] @CsvDir = 'E:\InsightImport',
        @TableName = 'All', @SystemTablesExist = 0, @BatchNum = null, @TotalThreads = null;

Local Ref Parsing
    EXEC [InsightImport].[Insight].[s_ImportSubTables] @TableName = 'All',
        @TableType = 'LocalRef', @BatchNum = null, @TotalThreads = null;

Multi-value Parsing
    EXEC [InsightImport].[Insight].[s_ImportSubTables] @TableName = 'All',
        @TableType = 'MV', @BatchNum = null, @TotalThreads = null;

Multi-value Sub-value Parsing
    EXEC [InsightImport].[Insight].[s_ImportSubTables] @TableName = 'All',
        @TableType = 'MVSV', @BatchNum = null, @TotalThreads = null;

Local Ref Sub-value Parsing
    EXEC [InsightImport].[Insight].[s_ImportSubTables] @TableName = 'All',
        @TableType = 'LRSV', @BatchNum = null, @TotalThreads = null;

Parsing Consol Keys
    EXEC [InsightImport].[Insight].[s_T24ConsolKeys_Add] @TableName = 'All', @BatchNum = null;

Insight Attributes Calculations-Import
    EXEC s_CreateColumnCalculations 'InsightImport','All','dbo','All',1
    EXEC s_CreateColumnCalculations 'InsightImport','All','dbo','All',2
    EXEC s_CreateIndexes 'All','InsightImport'

Create InsightImport Rules
    DECLARE @CurrentETLDate date = (select max(mis_date) from InsightImport.dbo.DATES);
    --InsightImport Rules
    EXEC [InsightETL].[dbo].[s_CreateRuleGroup]
        @DatabaseName = N'InsightImport',
        @TableName = N'all',
        @SchemaName = N'all', --not hooked up -- will be all
        @ExecutionPhase = N'all',
        @ExecutionStep = 0, --all steps
        @BusinessDate = @CurrentETLDate,
        @IsPersisted = 2, --all
        @RuleDefinitionId = null,
        @BatchNum = NULL,
        @StagingEventLogId = null;

Setup Current ETL Date
    USE InsightETL
    DECLARE @CurrentETLDate date = (select max(mis_date) from InsightImport.dbo.DATES);
    UPDATE CurrentDate SET BusinessDate = @CurrentETLDate;

InsightLanding CSI BS Update
    DECLARE @CurrentETLDate date = (select max(BusinessDate) from InsightETL.dbo.CurrentDate);
    EXEC [InsightLanding].dbo.s_InsightLanding_CSI_Table_Update
        @SourceName = 'BS', @BSDate = @CurrentETLDate,
        @UserId = 'dbo', @CreateRowBaseIndex = 0,
        @DataCompression = 'COLUMNSTORE',
        @UpdateTableSchema = 1, @BatchNum = null, @TotalThreads = null;

InsightLanding CSI Defragmentation
    EXEC [InsightETL].[dbo].[s_ColumnStoreIndex_Defragmentation]
        @DatabaseName = N'InsightLanding',
        @CompressRowGroupsWhenGT = 5000,
        @MergeRowGroupsWhenLT = 150000,
        @DeletedTotalRowPercentage = 10,
        @DeletedSegmentsRowPercentage = 20,
        @EmptySegmentsAllowed = 0,
        @ExecOrPrint = N'EXEC',
        @BatchNum = null

Create InsightLanding Rules
    DECLARE @CurrentETLDate date = (select max(mis_date) from InsightImport.dbo.DATES);
    --InsightLanding Rules
    EXEC [InsightETL].[dbo].[s_CreateRuleGroup]
        @DatabaseName = N'InsightLanding',
        @TableName = N'all',
        @SchemaName = N'all', --not hooked up -- will be all
        @ExecutionPhase = N'all',
        @ExecutionStep = 0, --all steps
        @BusinessDate = @CurrentETLDate,
        @IsPersisted = 2, --all
        @RuleDefinitionId = null,
        @BatchNum = NULL,
        @StagingEventLogId = null;

Set Completion of the Current ETL Batch
    EXEC InsightETL.dbo.s_SetBatchStatusFinished;

In case the Current ETL Batch ended with Error, Report and Exit
    EXEC InsightETL.dbo.s_SetBatchStatusFinished;


DW Online Processing
Overview
The DW Online Processing job was introduced in R19 to carry out online (intraday) processing. The job
runs on a recurring schedule with a frequency interval of five minutes; this interval is configurable through
the SystemParametersLanding rule.

Steps
This job consists of the following steps.

Step Description (SQL Run)

Setup Current Online Date
    IF EXISTS (select max(NEXT_BUSINESS_DATE) from InsightImport.dbo.DATES)
    BEGIN
        DECLARE @CurrentETLDate date = (select max(NEXT_BUSINESS_DATE)
                                        from InsightImport.dbo.DATES);
        UPDATE CurrentDate
        SET OnlineBusinessDate = @CurrentETLDate
        WHERE OnlineBusinessDate is null;
    END

Process Import Online
    EXEC [Online].s_ProcessImportOnline_Update @SourceName = N'BS'


Run Reports Subscriptions


Overview
From Release 2017, the Run Subscriptions SQL agent job was introduced to create soft copies of Quick and
Custom reports for which end-users have set up a subscription (for more detailed information about Quick
Reports, Custom Reports and Subscriptions in the Analytics Web Front End, please refer to the Analytics
R17 Front End User Guide). The Run Subscriptions agent job consists of three steps.

Steps
1. The first step runs the InsightSystem..s_GetKPIAlerts stored procedure. This stored procedure checks
for active Report Subscriptions in the InsightSystem..KPIDefinitions table and populates the
KPINoneAlerts table with the list of KPI ids to be processed. The latter table is then used by a view
that drives Step 2.
2. The second step runs the PowerShell script that generates the report files and saves them in the
designated folder configured in the System Settings screen of the Analytics Web Front End.
3. The third step updates the ApplicationLogs table with a confirmation that the required files were
created successfully.


InsightLanding CSI Purge


Overview
This job purges data in InsightLanding based on a retention policy defined by a global parameter in
SystemParametersLanding and by a table-level value in ExtractList (PurgeOlderThan). The global
parameter was added to the SystemParametersLanding rule as follows:

• Value = 84
• Type = LandingHistoryCSI
• Name = Retention Period in Months

Steps
This job includes only one step that executes the s_InsightLanding_CSI_Purge stored procedure.
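Before relying on the purge, the configured retention can be verified. The query below is a sketch that assumes the SystemParametersLanding values surface through the InsightETL.dbo.v_SystemParameters view with the Type and Name shown above.

-- Sketch: check the retention policy the purge will apply (filter values assumed).
SELECT [Type], [Name], [Value]
FROM InsightETL.dbo.v_SystemParameters
WHERE [Type] = 'LandingHistoryCSI'
  AND [Name] = 'Retention Period in Months';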


InsightLanding CSI Defragmentation


Overview
This job compresses open rowgroups and merges multiple compressed rowgroups into one, removing
deleted rows in the process. Defragmentation ensures that all the entries in an index are contiguous, for
faster and more efficient access, instead of being spread out across multiple disks and partitions.

Steps
This job consists of the following steps.

Step Description (SQL Run)

Create a new Batch ID for the Maintenance run
    EXEC dbo.s_MergeUpdateBatchStatus null, null, null

CSI Defragmentation
    EXEC [dbo].[s_ColumnStoreIndex_Defragmentation]
        @DatabaseName = N'InsightLanding',
        @CompressRowGroupsWhenGT = 5000,
        @MergeRowGroupsWhenLT = 150000,
        @DeletedTotalRowPercentage = 10,
        @DeletedSegmentsRowPercentage = 20,
        @EmptySegmentsAllowed = 0,
        @ExecOrPrint = N'EXEC',
        @BatchNum = null

Set Completion of the current ETL Batch
    EXEC dbo.s_SetBatchStatusFinished;

In case the Current ETL Batch ended with Error, Report and Exit
    EXEC dbo.s_SetBatchStatusFinished;


InsightWarehouse CSI Defragmentation


Overview
This job compresses open rowgroups and merges multiple compressed rowgroups into one, removing
deleted rows in the process. Defragmentation ensures that all the entries in an index are contiguous, for
faster and more efficient access, instead of being spread out across multiple disks and partitions.

Steps
This job consists of the following steps.

Step Description (SQL Run)

Create a new Batch ID for the Maintenance run
    EXEC dbo.s_MergeUpdateBatchStatus null, null, null

CSI Defragmentation
    EXEC [dbo].[s_ColumnStoreIndex_Defragmentation]
        @DatabaseName = N'InsightWarehouse',
        @CompressRowGroupsWhenGT = 5000,
        @MergeRowGroupsWhenLT = 150000,
        @DeletedTotalRowPercentage = 10,
        @DeletedSegmentsRowPercentage = 20,
        @EmptySegmentsAllowed = 0,
        @ExecOrPrint = N'EXEC',
        @BatchNum = null

Set Completion of the current ETL Batch
    EXEC dbo.s_SetBatchStatusFinished;

In case the Current ETL Batch ended with Error, Report and Exit
    EXEC dbo.s_SetBatchStatusFinished;


InsightETL Maintenance
Overview
This agent job purges the content of the Log tables within the InsightETL database (i.e. EventLog,
EventLogDetails, StagingEventLog and StagingEventLogDetails) within a certain date range.

Steps
This job includes a single step (i.e. Purge Old Data) that purges InsightETL logs by executing the
s_InsightETL_RangePurge stored procedure. The date range to purge is determined through a parameter
stored in the InsightETL.dbo.v_SystemParameters view.
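The parameter's exact Type and Name values are installation-specific; inspecting the view confirms the configured purge window.

-- The purge-range parameter can be located by browsing the view (no filter assumed).
SELECT [Type], [Name], [Value]
FROM InsightETL.dbo.v_SystemParameters;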


KPI Cache Maintenance


Overview
This job sets the status of the KPI Cache to “expired” and recalculates the expired values.

Steps
This job consists of the following steps.

• Expire KPI Cache: uses T-SQL code to set the Expiry parameter of the KPI Cache to the current
date.
• Recalculate KPI Cache: uses T-SQL code to recalculate the expired KPI Cache values.


Multi-Company (Temenos Core banking Specific)


This section covers the multi-company Insight configuration for Core banking. While the multi-company
structure in Insight can support any core, the default mapping provided by the Core banking model bank
layer is designed specifically for Core banking multi-company implementations.

The approach taken was to minimize the changes required in the InsightStaging source view mapping layer,
so the mapping is robust enough to support the majority of multi-company configurations in Core banking.
The design approach is covered in detail so that technical users can modify or extend the multi-company
components when the default mapping does not suit their particular multi-company configuration.

Core Banking Multi-Company Data


Possible Core Banking multi-company scenarios are listed here.

Core Banking Set-up                        | Extract          | Extract Type      | Shared?                     | Note                                       | In Scope
Single Company / Single Company Multi-book | Single Extract   | NA                |                             | Single Company Implementation              | Y
Multi-company                              | Single Extract   | Merge Extract     | Company, Customer, Currency |                                            | Y
Extended Multi-company                     | Single Extract   | Merge Extract     | Company, Customer, Currency |                                            | Y
All above                                  | Multiple Extract | Multiple extracts | All above                   | Only Single Extract is currently supported | N

Multi-Company Joins in Analytics

Core banking File Types


The following table types need to be joined in v_source views.

Table Type Description

INT - Installation
    There is only one copy of installation-level tables in a Core Banking database, so the actual table
    name does not include a company mnemonic. Examples of INT-level tables include COMPANY, USER,
    VERSION and ENQUIRY.

CUS - Customer
    The CUSTOMER and related tables, which often include the customer number in the key. Examples
    include CUSTOMER.DEFAULT and LIMIT. This type of table can be shared between lead companies.

CST - Customer table
    Parameter tables related to customer; examples include INDUSTRY and SECTOR, together with
    parameter tables for limits, collateral and position management. This type of table can be shared
    between lead companies.

FIN – Financial
    There will always be one copy of a financial table per Lead Company. All branches linked to a Lead
    Company will share the same financial table. Lead Companies do not share FIN tables. Examples
    include ACCOUNT, FUNDS.TRANSFER and TELLER.

FRP - Financial Reporting
    There will be one copy of a financial reporting table for every company in the system; examples
    include the COB job list tables and REPGEN work files like RGP.TRANS.JOURNAL2.

FTD - Financial Table Descriptive
    Financial parameter tables that do not contain data with amounts or rates linked to a particular local
    currency. Examples include the AA.PRODUCT type tables and BASIC.INTEREST. This type of table can
    be shared between lead companies. FTD tables can be shared between Companies; the Id of the
    owner Company is specified in the DEFAULT.FINAN.COM Field. If a Company owns only a few FTD
    type files, they are specified in the SPCL.FIN.FILE to SPCL.FIN.MNE Fields.

FTF - Financial Table Financial
    Financial parameter tables that contain financial data, often local currency amounts. Examples
    include GROUP.DEBIT.INT, GENERAL.CHARGE and TAX. This type of table can be shared between
    lead companies.

CCY - Currency
    Tables containing currency-related information; examples include CURRENCY and
    PERIODIC.INTEREST. This type of table can be shared between lead companies.

NOS - Nostro
    Tables related to NOSTRO accounts. Examples include AGENCY and NOSTRO.ACCOUNT. This type of
    table can be shared between lead companies.

An example of a required join is a join from the Account (FIN) table to the Customer (CUS) table. We will
also need to create a new column defined as follows:

<Source System>:<Master Company Mnemonic>:<Account Company Code>
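In T-SQL the key takes the following shape; this sketch uses sample values from the tables in this section (the full join that produces it is shown further below).

-- Sketch: SourceBranchId = <Source System>:<Master Company Mnemonic>:<Account Company Code>
SELECT concat('BS', ':', 'BNK', ':', 'CH0010002') as [SourceBranchId];  -- yields 'BS:BNK:CH0010002'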


Account Table – Never shared between companies

LEAD_CO_MNE | BRANCH_CO_MNE | MIS_DATE   | @ID          | CUSTOMER | ACCOUNT_OFFICER
BNK         | BNK           | 2014-08-12 | EUR199610001 | NULL     | 1
BCH         | BCH           | 2014-08-12 | 120000000097 | 100101   | 42
BNK         | BNK           | 2014-08-12 | 110000002698 | 630001   | 3

Customer Table – Sharing to be determined

LEAD_CO_MNE | BRANCH_CO_MNE | MIS_DATE   | @ID
BNK         | BNK           | 2014-08-12 | 100100
BNK         | BNK           | 2014-08-12 | 100101
BNK         | BNK           | 2014-08-12 | 100103

Clearly, it is not possible to join the Account table to the Customer table without first establishing some
important pieces of information:

1) Which hierarchical relationship exists between companies within the installation.
2) Whether the Customer table is shared in the current installation.
3) Which Customer Company is associated with Branch_Co_Mne BCH.
4) Which company is the MASTER company in the installation.

The Company table and the Company Check table (below) are used to determine this.

The company check table will help us understand the relationship between different companies in our
installation and what the MASTER company is. As we can see, in this table, the company marked as MASTER
(i.e. first lead company) is LU0010001. LU0019003 is a branch dependent on LU0010001 and CH0010002
is another lead company. We know that CH0010002 is a lead company because it has its own set of financial
files. As we can see from the company code and company mnemonic assigned to the entry with @ID =
CUSTOMER, the customer files are shared and use the master company's mnemonic and company code.

Company Check Table – Customer files shared between lead companies

LEAD_CO_MNE | BRANC_CO_MNE | MIS_DATE   | @ID          | COMPANY_CODE            | COMPANY_MNE | USING_COM           | USING_MNE
BNK         | BNK          | 2014-08-12 | MASTER       | LU0010001               | BNK         |                     |
BNK         | BNK          | 2014-08-12 | SUB.DIVISION | 0001.9003.0002          | NULL        | LU0010001.LU0019003 | BNK.BCH.NII
BNK         | BNK          | 2014-08-12 | CUSTOMER     | LU0010001               | BNK         | LU0019003           | BCH.NII
BNK         | BNK          | 2014-08-12 | FINANCIAL    | LU0010001               | BNK         | LU0019003           | BCH.NII
BNK         | BNK          | 2014-08-12 | CURRENCY     | LU0010001.CH0010002     | BNK.BCH     | LU0019003           | NII
BNK         | BNK          | 2014-08-12 | NOSTRO       | LU0010001               | BNK         | LU0019003           | BCH.NII
BNK         | BNK          | 2014-08-12 | FIN.FILE     | LU0010001.CH0010002     | BNK.BCH     | LU0019003           | NII
BNK         | BNK          | 2014-08-12 | CCY.FILE     | USD.CHF.GBP.EUR.SGD.USD | NULL        | LU0010001.LU0019003 | BNK.BCH.NII

This information is also confirmed by the company table – the company with mnemonic BCH has, in fact,
its own FINANCIAL_MNE but shares the CUSTOMER_MNEMONIC with BNK.

Company Table – Customer files shared between lead companies

@ID       | MNEMONIC | CUSTOMER_MNEMONIC | FINANCIAL_MNE | CURRENCY_MNEMONIC | DEFAULT_FINAN_MNE | SPCL_CUST_MNE | FINAN_FINAN_MNE
CH0010002 | BCH      | BNK               | BCH           | BNK               | BNK               | NULL          | BNK
LU0010001 | BNK      | BNK               | BNK           | BNK               | BNK               | NULL          | BNK
LU0019003 | NII      | BNK               | BNK           | BNK               | BNK               | NULL          | BNK
We see that we can join on Company.Customer_Mnemonic.

So the SQL join would be:

Select a.LEAD_CO_MNE as Account_Branch_Co_Mne, a.CUSTOMER, a.ONLINE_ACTUAL_BAL,
       a.[@id] as Account_id, c.lead_co_mne as Customer_Lead_Co_Mne,
       com.mnemonic as Company_Mnemonic,
       com.Customer_Mnemonic as Company_Customer_Mnemonic,
       a.CUSTOMER as Customer_ID,
       concat('BS:', CO_CHK.COMPANY_MNE, ':', CAST(A.[CO_CODE] as nvarchar(50))) as [SourceBranchId]
FROM [InsightSource].[BS].[ACCOUNT] a
LEFT JOIN [InsightSource].[BS].[COMPANY] COM
    ON COM.MNEMONIC = A.BRANCH_CO_MNE
LEFT JOIN [InsightSource].[BS].[COMPANY_CHECK_Company_Mne] CO_CHK
    ON CO_CHK.[@ID] = 'MASTER' and CO_CHK.[Sequence] = 1
LEFT JOIN [InsightSource].[BS].[CUSTOMER] c
    ON COM.CUSTOMER_MNEMONIC = c.LEAD_CO_MNE and a.CUSTOMER = c.[@ID]

Resulting in:

Account_Branch_Co_Mne | CUSTOMER | ONLINE_ACTUAL_BAL | Account_id   | Customer_Lead_Co_Mne | Company_Mnemonic | Company_Customer_Mnemonic | Customer_ID | SourceBranchId
BCH                   | 100101   | -57776.19         | 120000000022 | BNK                  | BCH              | BNK                       | 100101      | BS:BNK:CH0010002
BCH                   | 100101   | NULL              | 120000000038 | BNK                  | BCH              | BNK                       | 100101      | BS:BNK:CH0010002
BCH                   | 100101   | -107624.27        | 120000000049 | BNK                  | BCH              | BNK                       | 100101      | BS:BNK:CH0010002

JOIN features
The general procedure to determine what the join should be includes the following steps:

1. Determine the file type of the core table (the first table referenced in the From clause of a
   v_source view). In the case of v_sourceAccountBSAA_Accounts it is AA_ARRANGEMENT; by
   referring to the FILE_CONTROL table we determine that AA_ARRANGEMENT is a FIN table.
2. Then, by referring to Table 1 and the column [Used to join to table type], we look for FIN and find
   that FIN relates to Financial_mne.
3. Therefore the join from the Company table to the joined-to (ACCOUNT) table would be ON
   COM.FINANCIAL_MNE = A.LEAD_CO_MNE and AA.LINKED_APPL_ID = A.[@ID].
4. Use the Company_Check table to retrieve information about the MASTER company.
5. The first part of the join is usually the same.
6. Select * FROM [InsightSource].[BS].[AA_ARRANGEMENT] aa
   LEFT JOIN [InsightSource].[BS].[COMPANY] COM ON COM.MNEMONIC = AA.BRANCH_CO_MNE
   The first table always links on BRANCH_CO_MNE.
7. The complete join would be:
   Select *
   FROM [InsightSource].[BS].[AA_ARRANGEMENT] aa
   LEFT JOIN [InsightSource].[BS].[COMPANY] COM ON COM.MNEMONIC = AA.BRANCH_CO_MNE
   LEFT JOIN [InsightSource].[BS].[ACCOUNT] a ON a.LEAD_CO_MNE = COM.FINANCIAL_MNE
       AND A.[@ID] = aa.LINKED_APPL_ID
   The other table always links on LEAD_CO_MNE.
8. Any subsequent joins can (generally) be made on the Company table, which has already been
   joined to the first table in the v_source view.

Core banking Company Metadata


The company table is used to determine the Branch/Company hierarchy, as well as to serve as an
intermediary table so that tables of different file types can be joined.
Table 1 – Company Metadata

Field: Description (used to join to table type / calculation detail, where applicable)

@id: The most granular level of company and/or branch; the bottom of the hierarchy.
Mnemonic: The mnemonic of the @id record.
Financial_com: The parent of the @id record.
Financial_mne: The mnemonic of Financial_com. (Used to join to table types: FIN, FRP.)
Customer_company
Customer_mnemonic: (Used to join to table type: CUS.)
Currency_company
Currency_mnemonic: (Used to join to table type: CUR.)
SPCL_FIN_FILE: If a Company owns only a few FTD type files, they can be specified in the SPCL.FIN.FILE
    to SPCL.FIN.MNE Fields. (Used to join to table type: FTD.)
    Detail: CASE WHEN SPCL_FIN_FILE IS NULL THEN Default_Finan_Mne ELSE SPCL_FIN_MNE END
DEFAULT_FINAN_MNE: (Used to join to table type: INT.)
SPCL_FIN_MNE
Company Type: The logic here determines which records are lead companies, the master company or
    branches.
    Detail: CASE WHEN Financial_com = Customer_company THEN 'Master'
                 WHEN [@id] = Financial_com THEN 'Lead'
                 ELSE 'Branch' END
Default_Com: Used to identify the company record that will provide the data to default into the new
    company record being created.
SPCL_CUST_FILE: (Used to join to table type: CST.)
    Detail: CASE WHEN COM.SPCL_CUST_FILE IS NULL THEN COM.DEFAULT_CUST_MNE
                 ELSE COM.SPCL_CUST_MNE END
DEFAULT_CUST_MNE
SPCL_CUST_MNE
Finan_Finan_Mne: (Used to join to table type: FTF.)

Table 2 – Company_Check Metadata

@id: Describes relationships between tables in an efficient manner. Used to filter the MASTER company,
    e.g.:
    LEFT JOIN [InsightSource].[BS].[COMPANY_CHECK_Company_Mne] CO_CHK
        ON CO_CHK.[@ID] = 'MASTER' and CO_CHK.[Sequence] = 1
COMPANY_CODE: The company that owns the related data. Multi-valued.
COMPANY_MNE: The company mnemonic of the company that owns the related data. Multi-valued.
FILE_KEY: The key to the company check record. Will be one of the following (see overview): CCY.FILE,
    CURRENCY, CUSTOMER, FIN.FILE, FINANCIAL, MASTER, NOSTRO, SUB.DIVISION.
USING_COM: The company that shares the data owned by the company set up in the associated
    COMPANY.CODE. Sub-valued within the multi-valued COMPANY.CODE set.
USING_MNE: The company mnemonic of the company that shares the data owned by the company set up
    in the associated COMPANY.CODE. Sub-valued within the multi-valued COMPANY.CODE set.

Primary and Foreign Natural Keys


It should be ensured that all foreign keys match their corresponding primary keys. For example, the method
used to calculate the SourceBranchID (foreign natural key) in v_SourceAccountBS should be the same as
that used to calculate the SourceBranchID (primary natural key) in v_SourceBranchBS. So if the
SourceBranchID is Source System + Company/Branch Mnemonic + Company Code, Branch should be the
lowest level of granularity in the company table, e.g. 'BS:' + COM.MNEMONIC + ':' +
CAST(A.[CO_CODE] as varchar) as [SourceBranchId].
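A minimal sketch of this symmetry, assuming the BS source prefix used throughout this guide; only the key expression matters here, and the identical expression must appear in v_SourceBranchBS so that every fact-side key matches exactly one branch-side key.

-- Foreign natural key side (as in v_SourceAccountBS):
SELECT 'BS:' + COM.MNEMONIC + ':' + CAST(A.[CO_CODE] as varchar(50)) as [SourceBranchId]
FROM [InsightSource].[BS].[ACCOUNT] A
LEFT JOIN [InsightSource].[BS].[COMPANY] COM ON COM.MNEMONIC = A.BRANCH_CO_MNE;
-- Primary natural key side: v_SourceBranchBS must build SourceBranchId with the same expression.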


Multi-Currency (Temenos Core banking Specific)


Multi-currency functionality satisfies the requirement to convert Balances in a multi-company multi-currency
environment to the currency of the Master Company. Presentation Currency amounts are created for four
fact tables.

V_Source views
V_SourceCurrencyBS

This v_source view populates Fact and Dim Currency.

Field Name Description

BusinessDate
SourceCurrencyID: The business key for Currency, e.g.
    concat('BS', ':', c.Lead_CO_MNE, ':', isnull(left(c.[@ID],3),''),
           isnull(co.Local_Currency,''), ':',
           cast(c.CURRENCY_MARKET as nvarchar(4))) AS SourceCurrencyID
    where c is the alias of CURRENCY and co is the alias of the Currency Company.
SourceBranchId: E.g.
    concat('BS:', CO_CHK.COMPANY_MNE, ':' + C.[CO_CODE]) as [SourceBranchId]
    where CO_CHK is the alias of the Company_Check table.
CurrencyCodeFrom: The currency being converted from.
CurrencyCodeTo: The currency being converted to.
BuyRate
SellRate
MidRevalRate
RevalRate
MultiplierRate
RateDescription
NegotiableAmount
CurrencyMarketCode: The code for a particular rate, since there can be various rates for a particular
    currency cross.
WAvgMidRevalRate: The rate from a particular Currency record whose currency market code is captured
    in the Source System for use in conversion to Presentation Currency.
WAvgRevalRate


V_SourceBranchBS
Fields in this view are used to populate DimBranch as well as to help set the SourceCurrencyID by executing
Dataset update statements defined in InsightETL.dbo.RuleDefinitions. See Master Data below.

Field Name Description


SourceBranchId
LeadCompany
BranchName
BranchNum
BranchAddress
Region
CompanyType
CompanyName
CurrencyCompany For a branch of a company, the Lead Company from which
the currency is inherited.
CurrencyMnemonic The mnemonic of the lead company above.
LocalCurrency The currency of the particular company.
PresentationCurrency The currency of the Master Company

InsightETL
Dataset business rules defined in InsightETL are used to set the SourceCurrencyID of the fact tables for
which the conversion to Presentation currency is done.

There are update statements for:

• FactGL
• FactAccount
• FactGLTran
• FactAcctTran

For example:

Update a
SET SourceCurrencyID = ba.LeadCompany + ':' + ba.localcurrency + ba.Presentationcurrency + ':' +
                       isnull(S.Value,1)
From stagingGL a
Left Join StagingBranch ba
    on a.sourcebranchid = ba.sourcebranchid
--The derived table below gets the correct currency rate for the record, either the straight rate
--(code 1) or the Weighted Average Rate (e.g. code 99).
Left Join (Select * from SystemParameters where [Type] = 'CurrencyMarketRate') S
    on S.Name = a.GLInsightAttribute1

Data Model

The following tables have a CurrencyID foreign key.

• FactAccount
• FactGL
• FactAcctTran
• FactGLTran

This allows these tables to link to DimCurrency to get the applicable currency cross code, e.g. USDGBP,
to convert from the Lead Company's currency to the Master Company's currency.

The DimCurrency table is then linked to the FactCurrency table to get the applicable rate.

[Diagram: FactAccount, FactGL, FactAcctTran and FactGLTran each carry a CurrencyID foreign key to
DimCurrency; DimCurrency in turn links to FactCurrency (on CurrencyID and BusinessDate) to obtain the
applicable rates.]


Reporting Views

The calculation of the Presentation Balance/Amounts is done in the reporting views v_Account, v_GL,
v_AcctTran and v_GLTran. The presentation balance calculation is configured in
InsightWarehouse..DataDictionary.

For example:

SELECT
    [DimAccount].AccountId as AccountId
    , [FactAccount].BranchId as AcctBranchId
    , [FactAccount].Balance as Balance
    -----
    , ([FactAccount].ForeignCurrencyBal * [DimCurrency>FactCurrency].MultiplierRate) as
      PresentationBalance
FROM
    FactAccount
    left JOIN (Other tables Here)
    left join DimCurrency [FactAccount>DimCurrency]
        on FactAccount.CurrencyId = [FactAccount>DimCurrency].CurrencyId
    left join FactCurrency [DimCurrency>FactCurrency]
        on [FactAccount>DimCurrency].[CurrencyId] = [DimCurrency>FactCurrency].CurrencyId
        AND FactAccount.BusinessDate = [DimCurrency>FactCurrency].BusinessDate


Multi-Tenant Deployment
The Advanced Analytics Platform can be set up in a multi-tenant environment. In R18, there is no
specific installer for multi-tenant deployment, but a SQL package is used for this purpose, in which a
specific variable or flag is set when the multi-tenant option is chosen.

When the platform is initially installed with this option enabled, at least two tenants are made available
by default, i.e. Tenant0 and Tenant1, both stored in the InsightSystem.dbo.Tenants table. The former
stores a user in charge of tenant administration, while the latter can normally be used for business-as-usual
Analytics and BI16.

Tenant assignment to databases


A unique tenant identifier is assigned at the moment of implementation. This identifier is appended to the
names of the Insight databases; e.g. tenant ABC will use Insight_ABC, InsightLanding_ABC,
InsightSource_ABC, etc. The tenant is appended when the database is published, via the
TargetDataBaseName.

When external applications need to access the correct database (e.g. Team Foundation Server), a number
of publish variables are used to reference the correct tenant database; e.g. the $(InsightSource) variable
will be replaced with InsightSource_ABC during publish. It is important to remember that all code must use
variables instead of hardcoded database names.
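The convention is easiest to see in a sketch (SQLCMD/SSDT publish-variable syntax; the tenant and table names are examples):

-- $(InsightSource) resolves to the tenant database (e.g. InsightSource_ABC) at publish time,
-- so no database name is hardcoded in the source.
SELECT *
FROM [$(InsightSource)].[BS].[ACCOUNT];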

Data values
Tables such as AttributeCalculations, CustomColumn and UpdateOrder may have hardcoded database
names. These will be changed in the post-deploy script by replacing the database names.

Scheduled Jobs
Separate scheduled jobs will be created for each tenant. Eg. Analytics ETL (ABC),
InsightLandingMaintenance (ABC) etc. Each step will point to the appropriate tenant database. This is
accomplished with a $(tenant) parameter in the Post Deploy script of TFS.

User Roles
Users’ roles are customized for the installation. This configuration can either consist of creating Server
Roles for each tenant or using Active Directory groups.

Publishing
The publishing process simply consists of the execution of an ad hoc PowerShell script. A manual publish
from Team Foundation Server requires all the $(database) parameters to be populated with the proper
tenant suffix. For publishing from a build, the standard Framework/ModelBank build should be used. A
new InsightPublishParameters.xml includes a Tenant parameter, and the new PowerShell script
InsightPublishDatabaseTenant.ps1 uses the tenant parameter to change the target database names.
For the local layer, it is assumed the database name parameters in TFS will be set for each tenant in
separate local layers.

16 The Tenants table is thereafter managed through the Tenant Management option of the System Menu
in the Analytics Application front end. Please refer to the Analytics Web Front End User Guide for more
detailed information about this functionality.


Cubes
During publish, a script is created to rename the SSAS database in the XML file. Connection strings in
TemenosSSASFunctions must also be changed to use the proper tenant-suffixed database.

Reports
As for cubes, any report data sources should be changed to point to the appropriate database name.

Application
Finally, any application settings should be changed to point to the appropriate database name.


General Ledger
There are five objects (that include tables and a number of abstraction views) in InsightWarehouse that
are dedicated to storing General Ledger-, Profit & Loss- and Budget-related data: GL, GLConsolidated,
GLTran, GLAdjmt and Employee.

Dimensions of the GL, PL and Budget entries are hosted in the DimGL table while measures reside in
FactGL. Furthermore, GL transactions are stored in the FactGLTran table, FactGLAdjmt contains GL
Adjustments data and DimEmployee stores the information of the Department or of the Department
Account Officer (i.e. the Employee) managing a particular GL entry. This table stores, in general,
information about the hierarchical structure of a financial institution and will be used to identify the
employee or department in charge of specific accounts and customers in the Advanced Analytics Platform.
Finally, DimGLConsolidated and FactGLConsolidated contain dimensions and facts about all Chart of
Accounts consolidated entries, including those that do not have balance information.


Data Model

As we can see from the diagram below, FactGL is linked to DimGL and DimEmployee. FactGL and
FactGLTran are linked to FactGLAdjmt, while the GLConsolidated tables are connected to one another and
also to FactGLTran, FactGL, FactGLAdjmt and DimEmployee.

[Diagram: the GL data model, showing DimGL, FactGL, DimEmployee, FactGLTran, FactGLAdjmt,
DimGLConsolidated and FactGLConsolidated with their key columns and the relationships described above.]

Figure 34 - General Ledger Data Model

R19 Model Updates


There are no notable changes to the GL data model in R19; it remains the same as in R18.

GL Adjustments
This section covers the GL Adjustments functionality of Analytics and the Core banking specific configuration
required.


As we can see in Figure 34, an adjustment must have a corresponding GL Fact. It is linked to DimGL
(an adjustment is always linked to a GL Account) and to DimEmployee, to track which employee made the
adjustment.

Multiple rows are created in FactGLAdjmt, between Business Date and Posted Date. For example, if an
adjustment is posted on Friday, meant to have taken effect on the previous Monday, then four rows will
be created in FactGLAdjmt, with Business Dates of Monday through Thursday.

Adjustments made in Core Banking will have a corresponding row in FactGLTran. The source of FactGLTran
and FactGLAdjmt is the same; the Adjustments are a subset of the GL Transactions, where Business Date
is earlier than Posted Date.
Should GL be extracted from a Third Party source system, local customization can be implemented to fit
this data into InsightWarehouse.

GL Adjustment components of Analytics databases

Functional Area Description

InsightImport
    GL Adjustments are identified within the following tables: BNK_RE_CONSOL_SPEC_ENTRY,
    BNK_STMT_ENTRY, BNK_CATEG_ENTRY, PC_CATEG_ADJUSTMENT and PC_STMT_ADJUSTMENT.

InsightLanding
    Similarly to InsightImport, GL Adjustment data resides in BNK_RE_CONSOL_SPEC_ENTRY,
    BNK_STMT_ENTRY, BNK_CATEG_ENTRY, PC_CATEG_ADJUSTMENT and PC_STMT_ADJUSTMENT.

InsightETL
    No changes.

InsightSource
    For default data, the dbo.GLAdjmt table holds GL Adjustments entered directly into Insight,
    duplicated for all applicable dates.

InsightStaging
    Source views are used to populate FactGLAdjmt: v_sourceGLAdjmtBSSpec, v_sourceGLAdjmtBSStmt
    and v_sourceGLAdjmtBSCateg. GL Adjustment related stored procedures are s_FactGLAdjmt_Extract,
    s_FactGLAdjmt_Transform and s_FactGLAdjmt_Load.

InsightWarehouse
    GL Adjustment data resides in the FactGLAdjmt table. v_GL exposes data coming from DimGL,
    FactGL and also FactGLAdjmt; v_GL entries present both amount and adjusted amount columns.

The Core banking tables extracted for GL adjustment are:

RE_CONSOL_SPEC_ENTRY

STMT_ENTRY

CATEG_ENTRY

PC_CATEG_ADJUSTMENT

PC_STMT_ADJUSTMENT

The first three tables above contain every transaction for the current date coming from Core banking,
while the two tables prefixed with PC contain only records describing adjustments. Therefore, the method
used to identify adjustments is to reconcile the former three tables with the latter two.

DW Export
Adjustments will only come from the Source System (Temenos Core banking), in the following DW Export
tables:

RE_CONSOL_SPEC_ENTRY

STMT_ENTRY

CATEG_ENTRY

PC_CATEG_ADJUSTMENT

PC_STMT_ADJUSTMENT

The GL Adjustment entries in these tables have already been made into multiple rows, as described above.


InsightImport
Post-Closing GL Adjustments, the only adjustment currently coded in the view sources, are contained in
five existing tables which are extracted from Temenos Core banking by DW Export:

RE_CONSOL_SPEC_ENTRY

STMT_ENTRY

CATEG_ENTRY

PC_CATEG_ADJUSTMENT

PC_STMT_ADJUSTMENT

For all these tables, GL Adjustments have SYSTEM_ID = ‘DC’, and VALUE_DATE < BOOKING_DATE.

There is a Core Banking module called Data Capture, where adjustments are manually added. This is why
we filter on ‘DC’. The Value Date represents the effective date, the date to which adjustments are to be
back-dated. The Booking Date represents the date the adjustment was entered into Data Capture.

There are three ways of identifying adjustments. First, the STMT_ENTRY table can be joined to
PC_STMT_ADJUSTMENT: if the id of a STMT_ENTRY record is also found in PC_STMT_ADJUSTMENT, that
record is an adjustment. Second, a join can be established between CATEG_ENTRY and
PC_CATEG_ADJUSTMENT: if the id of a record found in the former is also found in the latter, the record is
an adjustment. Third, ids of records coming from RE_CONSOL_SPEC_ENTRY are searched for in both
PC_CATEG_ADJUSTMENT and PC_STMT_ADJUSTMENT: a match in either table identifies an adjustment.
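A minimal sketch of the first check, assuming the tables are carried forward into the InsightSource BS schema under these names, with the [@ID], SYSTEM_ID, VALUE_DATE and BOOKING_DATE columns described above:

-- Sketch: STMT_ENTRY rows that are back-dated adjustments.
SELECT se.*
FROM [InsightSource].[BS].[STMT_ENTRY] se
INNER JOIN [InsightSource].[BS].[PC_STMT_ADJUSTMENT] adj
    ON adj.[@ID] = se.[@ID]            -- id present in both tables => adjustment
WHERE se.SYSTEM_ID = 'DC'              -- entered through Data Capture
  AND se.VALUE_DATE < se.BOOKING_DATE; -- effective date earlier than booking date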

InsightLanding and InsightSource


No changes are applied to these databases. The above tables are simply carried forward to InsightSource.

InsightStaging
Views
A number of v_source views are used to populate FactGLAdjmt:

v_sourceGLAdjmtBSSpec, v_sourceGLAdjmtBSStmt, v_sourceGLAdjmtBSCateg

The views create multiple rows for a single adjustment. For example, if an adjustment was entered into
Data Capture on Friday, to be effective on the previous Monday, then four rows would be created, with
business dates of Monday, Tuesday, Wednesday, and Thursday, all with a Posted Date of Friday.

Stored Procedures
Three stored procedures are used specifically for GL Adjustments; they are similar to the
Extract/Transform/Load stored procedures for other Transaction Facts:

• s_FactGLAdjmt_Extract
• s_FactGLAdjmt_Transform
• s_FactGLAdjmt_Load

InsightWarehouse
Views
The final calculation of the adjustment is done in InsightWarehouse views like v_GLConsolidated and v_GL.

For example:

ALTER view [dbo].[v_GLConsolidated] as


select
[FactGLConsolidated].BusinessDate as BusinessDate
, [FactGLConsolidated].CurrencyId as CurrencyId
, [FactGLConsolidated].CurrencyId2 as CurrencyId2
, [FactGLConsolidated].CurrencyId3 as CurrencyId3
, [FactGLConsolidated].BranchId as GLBranchId
, [DimGLConsolidated].GLConsolidatedId as GLConsolidatedId
, [FactGLConsolidated].GLAmountMTD as GLAmountMTD
, [FactGLConsolidated].GLForeignAmountMTD as GLForeignAmountMTD
, [FactGLConsolidated].GLMTDAvgAmount as GLMTDAvgAmount
, [FactGLConsolidated].GLMTDAvgForeignAmount as GLMTDAvgForeignAmount
, [FactGLConsolidated>DimBranch].LeadCompany as Company
, [DimGLConsolidated].SourceGLConsolidatedId as SourceGLConsolidatedId
, [FactGLConsolidated>DimBranch].BranchName as GLBranchName
, [FactGLConsolidated>DimBranch].BranchNum as GLBranchNum
, [DimGLConsolidated].GLBSAttribute1 as GLBSAttribute1
, [DimGLConsolidated].GLBSAttribute10 as GLBSAttribute10
, [DimGLConsolidated].GLBSAttribute2 as GLBSAttribute2
, [DimGLConsolidated].GLBSAttribute3 as GLBSAttribute3
, [DimGLConsolidated].GLBSAttribute4 as GLBSAttribute4
, [DimGLConsolidated].GLBSAttribute5 as GLBSAttribute5
, [DimGLConsolidated].GLBSAttribute6 as GLBSAttribute6
, [DimGLConsolidated].GLBSAttribute7 as GLBSAttribute7
, [DimGLConsolidated].GLBSAttribute8 as GLBSAttribute8
, [DimGLConsolidated].GLBSAttribute9 as GLBSAttribute9
, [DimGLConsolidated].GLCurrency as GLCurrency
, [DimGLConsolidated].GLDescription as GLDescription
, [DimGLConsolidated].GLAccountNum as GLAccountNum
, [DimGLConsolidated].GLInsightAttribute1 as GLInsightAttribute1
, [DimGLConsolidated].GLInsightAttribute10 as GLInsightAttribute10
, [DimGLConsolidated].GLInsightAttribute2 as GLInsightAttribute2
, [DimGLConsolidated].GLInsightAttribute3 as GLInsightAttribute3
, [DimGLConsolidated].GLInsightAttribute4 as GLInsightAttribute4
, [DimGLConsolidated].GLInsightAttribute5 as GLInsightAttribute5
, [DimGLConsolidated].GLInsightAttribute6 as GLInsightAttribute6
, [DimGLConsolidated].GLInsightAttribute7 as GLInsightAttribute7
, [DimGLConsolidated].GLInsightAttribute8 as GLInsightAttribute8
, [DimGLConsolidated].GLInsightAttribute9 as GLInsightAttribute9
, [DimGLConsolidated].GLNum as GLNum
, [FactGLConsolidated>DimBranch].Region as GLRegion
, [DimGLConsolidated].GLSourceSystem as GLSourceSystem
, [DimGLConsolidated].GLThirdPartyAttribute1 as GLThirdPartyAttribute1
, [DimGLConsolidated].GLThirdPartyAttribute10 as GLThirdPartyAttribute10
, [DimGLConsolidated].GLThirdPartyAttribute2 as GLThirdPartyAttribute2


, [DimGLConsolidated].GLThirdPartyAttribute3 as GLThirdPartyAttribute3
, [DimGLConsolidated].GLThirdPartyAttribute4 as GLThirdPartyAttribute4
, [DimGLConsolidated].GLThirdPartyAttribute5 as GLThirdPartyAttribute5
, [DimGLConsolidated].GLThirdPartyAttribute6 as GLThirdPartyAttribute6
, [DimGLConsolidated].GLThirdPartyAttribute7 as GLThirdPartyAttribute7
, [DimGLConsolidated].GLThirdPartyAttribute8 as GLThirdPartyAttribute8
, [DimGLConsolidated].GLThirdPartyAttribute9 as GLThirdPartyAttribute9
, [DimGLConsolidated].SourceApplication as SourceApplication
, [FactGLConsolidated].GLAmount as GLUnAdjmtAmt
, [FactGLConsolidated].GLForeignAmount as GLUnAdjmtForeignAmt
, [FactGLConsolidated].SystemSecurityId as SystemSecurityId
, case when DimGLConsolidated.GLSourceSystem = 'T24' then
ISNULL([FactGLConsolidated].GLAmount,0) + isnull([FactGLConsolidated>Adjmt].GLAdjAmtSum,0)
end as GLActualAmount
, isnull([FactGLConsolidated>Adjmt].GLAdjAmtSum,0) as GLAdjmtAmount
, isnull([FactGLConsolidated>Adjmt].GLAdjForeignAmtSum,0) as GLAdjmtForeignAmt
, [FactGLConsolidated].GLAmount + isnull([FactGLConsolidated>Adjmt].GLAdjAmtSum,0)
as GLAmount
, case when DimGLConsolidated.GLSourceSystem = 'Budget' then
ISNULL([FactGLConsolidated].GLAmount,0) + isnull([FactGLConsolidated>Adjmt].GLAdjAmtSum,0)
end as GLBudgetAmount
, [FactGLConsolidated].GLCredits + isnull([FactGLConsolidated>Adjmt].GLCredits,0)
as GLCredits
, [FactGLConsolidated].GLDebits + isnull([FactGLConsolidated>Adjmt].GLDebits,0) as
GLDebits
, [FactGLConsolidated].GLForeignAmount +
isnull([FactGLConsolidated>Adjmt].GLAdjForeignAmtSum,0) as GLForeignAmount
, [FactGLConsolidated].GLForeignCredits +
isnull([FactGLConsolidated>Adjmt].GLForeignCredits,0) as GLForeignCredits
, [FactGLConsolidated].GLForeignDebits +
isnull([FactGLConsolidated>Adjmt].GLForeignDebits,0) as GLForeignDebits
, [FactGLConsolidated].GLForeignInitialAmount +
isnull([FactGLConsolidatedLastBusinessDate>Adjmt].GLAdjForeignAmtSum,0) as
GLForeignInitialAmount
, [FactGLConsolidated].GLInitialAmount +
isnull([FactGLConsolidatedLastBusinessDate>Adjmt].GLAdjAmtSum,0) as GLInitialAmount

from
FactGLConsolidated
left join
DimGLConsolidated on
DimGLConsolidated.GLConsolidatedId = FactGLConsolidated.GLConsolidatedId
left join
DimBranch [factglConsolidated>DimBranch] on
FactGLConsolidated.BranchId = [factglConsolidated>DimBranch].BranchId
LEFT JOIN
(select GLConsolidatedId, BusinessDate,
sum(GLAdjmtAmount) as GLAdjAmtSum,
sum(GLAdjmtForeignAmt) as GLAdjForeignAmtSum,
max(GLAdjmtPostedDate) as AdjPostedDateMax,
ISNULL(sum(case when GLAdjmtAmount < 0 and GLAccountingDate = BusinessDate then
ABS(GLAdjmtAmount) end),0) as GLDebits,
ISNULL(sum(case when GLAdjmtForeignAmt < 0 and GLAccountingDate = BusinessDate then
ABS(GLAdjmtForeignAmt) end),0) as GLForeignDebits,
ISNULL(sum(case when GLAdjmtAmount > 0 and GLAccountingDate = BusinessDate then
ABS(GLAdjmtAmount) end),0) as GLCredits,


ISNULL(sum(case when GLAdjmtForeignAmt > 0 and GLAccountingDate = BusinessDate then


ABS(GLAdjmtForeignAmt) end),0) as GLForeignCredits
from [InsightWarehouse].dbo.FactGLAdjmt group by GLConsolidatedId, BusinessDate)
[FactGLConsolidated>Adjmt] on
[FactGLConsolidated>Adjmt].GLConsolidatedId =
FactGLConsolidated.GLConsolidatedId and [FactGLConsolidated>Adjmt].BusinessDate =
FactGLConsolidated.BusinessDate

left join (select BusinessDate, BSLastBusinessDate from [InsightWarehouse].dbo.DimDate)


Dates on Dates.BusinessDate = FactGLConsolidated.BusinessDate

left join
(select GLConsolidatedId, BusinessDate,
sum(GLAdjmtAmount) as GLAdjAmtSum,
sum(GLAdjmtForeignAmt) as GLAdjForeignAmtSum
from [InsightWarehouse].dbo.FactGLAdjmt group by GLConsolidatedId, BusinessDate)
[FactGLConsolidatedLastBusinessDate>Adjmt] on
[FactGLConsolidatedLastBusinessDate>Adjmt].GLConsolidatedId =
FactGLConsolidated.GLConsolidatedId and
[FactGLConsolidatedLastBusinessDate>Adjmt].BusinessDate = Dates.BSLastBusinessDate

left join DimCurrency [FactGLConsolidated>DimCurrency] on


FactGLConsolidated.CurrencyId = [FactGLConsolidated>DimCurrency].CurrencyId
left join FactCurrency [DimCurrency>FactCurrency] on
[FactGLConsolidated>DimCurrency].[CurrencyId] =
[DimCurrency>FactCurrency].CurrencyId
AND FactGLConsolidated.BusinessDate =
[DimCurrency>FactCurrency].BusinessDate

left join DimCurrency [FactGLConsolidated>DimCurrency2] on


FactGLConsolidated.CurrencyId2 =
[FactGLConsolidated>DimCurrency2].CurrencyId
left join FactCurrency [DimCurrency>FactCurrency2] on
[FactGLConsolidated>DimCurrency2].[CurrencyId] =
[DimCurrency>FactCurrency2].CurrencyId
AND FactGLConsolidated.BusinessDate =
[DimCurrency>FactCurrency2].BusinessDate

left join DimCurrency [FactGLConsolidated>DimCurrency3] on


FactGLConsolidated.CurrencyId3 =
[FactGLConsolidated>DimCurrency3].CurrencyId
left join FactCurrency [DimCurrency>FactCurrency3] on
[FactGLConsolidated>DimCurrency3].[CurrencyId] =
[DimCurrency>FactCurrency3].CurrencyId
AND FactGLConsolidated.BusinessDate =
[DimCurrency>FactCurrency3].BusinessDate

--Inner Join

--dbo.v_SysSecVisibleRecords on FactGL.SystemSecurityId =
v_SysSecVisibleRecords.SystemSecurityId
GO

The output below shows a sample result from v_GLConsolidated, including the GL Amount:


Figure 35 - Output of the v_GLConsolidated view with Adjustment values (partial)

For all existing reports, GL Adjustments are automatically included in all amounts shown.

GL Adjustments are added to the corresponding GL Amount for that day and reported as GL Amount.

One of the columns on the left of v_GLConsolidated is “GLUnAdjmtAmt”, or “Unadjusted Amount”.

This is the original GL amount for the day, before any adjustments. This amount is not normally shown on
any reports.

Reports
All Financial Analytics reports provided Out-of-the-box will, by default, include the adjustments in any GL
Amounts displayed. The v_GLConsolidated view, which supplies GL data to the reports, will include the
adjustments in the existing GLAmount and GLForeignAmount fields. The adjustments will be grouped by
GL Account and Business Date and totaled.

Since the GLAmount and GLForeignAmount columns contain compound values, the fields below are crucial
to understanding the breakdown of their individual components.

Field Name Description


GLAdjmtAmount Adjustment amount
GLAdjmtForeignAmt Adjustment amount in foreign currency
GLUnAdjmtAmt GL Amount without adjustments
GLUnAdjmtForeignAmt GL Amount without adjustments, in foreign currency
GLAdjmtPostedDate Most recent date of adjustments for this GL Account and
Business Date.
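These fields make it possible to reconcile the adjusted and unadjusted figures directly against the view; for example:

-- Reconciliation check: GLAmount = GLUnAdjmtAmt + GLAdjmtAmount (Difference should be 0).
SELECT GLAccountNum, BusinessDate,
       GLUnAdjmtAmt, GLAdjmtAmount, GLAmount,
       GLAmount - (GLUnAdjmtAmt + GLAdjmtAmount) as Difference
FROM InsightWarehouse.dbo.v_GLConsolidated
WHERE GLAdjmtAmount <> 0;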

V_GLConsolidated would supply the data for General Ledger, Income Statement and Balance Sheet reports
and dashboards like the one below.


Figure 36 - Balance Sheet by Branch report

GL Consolidated
The GLConsolidated object consists of a Dim and a Fact table that contain records for all the Chart of
Accounts (i.e. COA) entries, including the ones without balance information.

InsightStaging
Views
A number of v_source views are used to populate the DimGLConsolidated and FactGLConsolidated tables,
e.g.:

v_sourceGLConsolidatedBSGL, v_sourceGLConsolidatedBSPL, v_sourceGLConsolidatedBudget

Stored Procedures
Six stored procedures are used specifically for GL Consolidated; they are similar to the
Extract/Transform/Load stored procedures for other Transaction Facts:

• s_DimGLConsolidated_Extract
• s_DimGLConsolidated_Transform
• s_DimGLConsolidated_Load
• s_FactGLConsolidated_Extract
• s_FactGLConsolidated_Transform
• s_FactGLConsolidated_Load


InsightWarehouse
Tables
As previously discussed, the two new tables added to InsightWarehouse to store GLConsolidated data are DimGLConsolidated and FactGLConsolidated. They inherit a number of columns that, in earlier releases, belonged to the GL object, i.e. all Banking System, Analytics and Third Party Attributes and GL Number-related columns. The GLConsolidated tables also store the aggregated balances from CRF by Company, Branch, GL Number and Currency, and GL Adjustments at the GL Number level.

In addition to this, FactGLConsolidated includes the following calculated columns:

 GLInitialAmount, GLForeignInitialAmount (set up through InsightETL.dbo.AttributeCalculations)
 GLAmountMTD, GLForeignAmountMTD (set up through InsightETL.dbo.AttributeCalculations)
 GLMTDAvgAmount, GLMTDAvgForeignAmount (set up through InsightETL.dbo.AttributeCalculations)
 GLDebits, GLForeignDebits (defined by the GLConsolidated-Debits Dataset Rule in the InsightETL Rule Engine)
 GLCredits, GLForeignCredits (defined by the GLConsolidated-Credits Dataset Rule in the InsightETL Rule Engine)

Views
The new GLConsolidated object involved the creation of three new abstraction views in the InsightWarehouse database, i.e. v_GLConsolidated, v_CubeDimGLConsolidated and v_CubeFactGLConsolidated. The first of these views exposes the content of the GLConsolidated object to the report layers, while the second and third allow it to be published to the GLConsolidated SSAS Cube.

The output below shows a sample result from v_GLConsolidated.

Figure 37 - Output of the v_GLConsolidated view (partial)


Cubes
As previously mentioned, a new Cube called GLConsolidated has also been created as part of the new
GLConsolidated object. This cube will automatically get installed as part of the Financial Analytics Content
Package.

Retail Analytics Contents and Relationships


Some of the out-of-the-box reports and dashboards of the Financial Analytics content now rely on the GLConsolidated object. It is worth observing that the GL relationship definition, which pointed to the v_GL abstraction view in previous releases, now uses the v_GLConsolidated abstraction view as a base table. The contents affected by this change are the following:

Quick Reports
The following Quick reports use GLConsolidated as their main source of data.

 Income Statement - Net by Currency
 Income Statement - Net
 Income Statement - Dr/Cr/Net by Currency
 Income Statement - Net/Budget/Var by Branch
 Income Statement - Budget by Branch
 Balance Sheet with Currency
 Income Statement - Dr/Cr/Net
 Balance Sheet Consolidated
 Balance Sheet Matrix by Currency
 Income Statement Consolidated
 Balance Sheet by Branch
 Income Statement - Net/Budget/Var
 Income Statement - Budget
 Income Statement by Branch
 Income Statement Matrix by Currency
 Balance Sheet Budget Analysis
 Income Statement Budget Analysis
 Income Statement Yearly Budget Analysis
 Balance Sheet - Dr/Cr/Net by Currency, by Branch
 Balance Sheet - Dr/Cr/Net
 Balance Sheet - Net/Budget/Var
 Balance Sheet - Budget by Branch
 Balance Sheet - Net by Currency
 Balance Sheet - Budget

 Balance Sheet - Net by Currency, by Branch
 Balance Sheet - Dr/Cr/Net by Currency
 Balance Sheet Monthly Analysis
 Income Statement Monthly Analysis
 Balance Sheet Quarterly Analysis
 Balance Sheet Monthly Budget Analysis
 Balance Sheet Quarterly Budget Analysis
 Income Statement Monthly Budget Analysis
 Income Statement Quarterly Analysis
 Income Statement Quarterly Budget Analysis
 Balance Sheet Daily Analysis
 Income Statement Daily Analysis
 Balance Sheet Yearly Analysis
 Balance Sheet Yearly Budget Analysis
 Income Statement Yearly Analysis
 Balance Sheet - Budget Drillthrough
 Balance Sheet - Dr/Cr/Net Drillthrough

Dashboards
All Retail Analytics dashboards currently rely on data from the GLConsolidated object.

Pivot Reports
The following Pivot reports use GLConsolidated as their main source of data.

 Balance Sheet
 Balance Sheet Movements
 Balance Sheet MTD Analysis
 Balance Sheet YoY Analysis
 Balance Sheet YTD Analysis
 Income Statement
 Income Statement Movements
 Income Statement MTD Analysis
 Income Statement YoY Analysis
 Income Statement YTD Analysis

Custom Reports
All Custom Reports currently rely on data from the GLConsolidated object.


General Data Protection Regulation (GDPR)


The General Data Protection Regulation (commonly abbreviated to GDPR) is a replacement for the
European Union Data Protection Directive, which has governed the Protection of Personal Data in the
European Union (EU) since 1995.

Since the original directive was passed, there has been a series of rapid technological and business advances which have brought new challenges to the use and protection of personal data. As a result, there was a desire to produce an updated set of regulations reflecting the new technological and business landscape.

In addition, the Lisbon Treaty created a new legal basis for a modernised and comprehensive approach to
data protection, including the free movement of data within the EU.

GDPR was designed to resolve three issues that had become apparent with the implementation of the original legislation:

 Since the Data Protection Directive, there has been an inconsistent approach to the application of data protection across the European Union, which created barriers for business and public authorities due to legal uncertainty and inconsistent enforcement.
 Difficulties for individuals to stay in control of their personal data.
 Gaps and inconsistencies in the protection of personal data in the field of police and judicial co-operation in criminal matters.
As part of the free movement of data within the EU, the GDPR gives data subjects (typically EU citizens) a new series of rights with respect to their data. These rights are listed below:

 The Right to Be Informed
 The Right to Access
 The Right to Rectification
 The Right to Erasure
 The Right to Restrict Processing
 The Right to Portability
 The Right to Object
 Rights related to automated decision making and profiling
This is a very important regulation, as fines for the most serious data protection breaches are 4% of worldwide turnover or €20 million (whichever is higher).

The key dates for the General Data Protection Regulation are:

 27th April 2016 – The GDPR is adopted
 24th May 2017 – Deadline for companies to implement the new privacy notice into annual policies and products
 25th May 2018 – GDPR comes into force
One key point to recognise is that to ensure a consistency of approach across the European Union, the
GDPR is a regulation and not a directive. As it is not a directive it does not need enabling legislation by
national governments to come into effect.

This document provides a high-level solution for GDPR and includes the following:

 Rights of Erasure – a mechanism to capture and validate the request and trigger the erasure process. This is done at the column level, since different columns can have different required retention periods.
 Personal Data Definition – a mechanism to build, maintain and store the metadata definition of Personal Data for Analytics data stores, and a mechanism to import the PDD from Core Banking.
 Rights Management – capture the request from Core Banking.
 Rights to Object and to Restrict Processing – a mechanism to exclude customers from additional processing in Analytics.

High Level Solution

The Temenos Analytic Customer Data Protection solution for GDPR covers:

 Importing metadata from Core Banking in order to act upon requests for the Right to Erasure
 Consent Management
 Identifying personal data across all Analytics data stores
 Sharing metadata details with all other Temenos products
 Support for the L2 country layer to identify personal data
 Support for L3 local development to identify personal data

In Scope

 Erasure Metadata management
 Rights of Erasure / right to be forgotten
 Consent Management
o Right to Restrict Processing
o Right to Restrict Marketing

Overview
This section contains an overview of Right to Erasure processing as well as Consent Management.

Right to Erasure Processing


The system allows erasure to be performed on tables in InsightLanding, which are raw dumps of Temenos
Core Banking/T24 extracts, and in InsightWarehouse (Analytics Data store) which is a modelled star
schema.

Metadata to define the above erasures will be stored in Analytics and will be populated by Core Banking
extracts, as well as Analytics application metadata configured by administrators of the Temenos Core
Banking Customer Data Protection module (commonly abbreviated to CDP).

The Temenos Analytic Customer Data Protection solution for GDPR consists of the following components:

 The Rules Engine data model.
 A mechanism to import Erasure Metadata from Core Banking into the Rules Engine data model. Erasure metadata includes the columns erased as well as the customers requesting the erasure.
 A mechanism to define the Erasure requirements for Analytics data stores (InsightWarehouse), and to import this into the Rules Engine data model.
 A mechanism to perform the Erasure Process based on the stored metadata above.

Core Banking Customer Data Protection Extracts


 The DW Export Core Banking application will be configured to extract the Core Banking CDP tables. These include CDP_DATA_ERASED_TODAY, a daily report summarizing the results of the core banking erasures, which contains everything the Analytics application needs to know in order to erase raw T24 data stored in InsightLanding. CZ_CUSTOMER_ACTIVITY will be extracted in order to trigger erasures of Analytics InsightWarehouse data.

Populating Customer Data Protection Metadata into the Rules Engine Data Model
There are two processes, one for erasing raw Temenos Core Banking data extracts stored in InsightLanding,
and one for erasing columns in InsightWarehouse. The metadata for each process has the same form but
is populated differently.

1. InsightLanding Erasure (Raw Temenos Core Banking data)

An extract called CDP_DATA_ERASED_TODAY is provided daily by core banking's DW.Export. This extract contains the columns erased in T24 that day and the customers they were erased for, as well as the data value that was used to replace the erased value. This data is inserted into the Rules Engine data model.

2. InsightWarehouse Erasure (Modelled Analytics Data)

Columns to be erased are configured by editing the InsightWarehouse..Datadictionary table. This data, together with data from the Core Banking table Customer.Activity and with retention and erasure replacement value data from Analytics lookup tables, is stored in the Rules Engine data model.

Erasure Process

 The erasure process will be triggered when the Action Date in the above metadata is reached.
 The Action Date is based on the erasure date in the case of data from CDP_DATA_ERASED_TODAY, or on the configured retention periods per column and the date the customer became inactive in the case of Analytics data.
 Erasure will be able to run in validation mode, so that it can be confirmed that a reasonable number of records are affected and that the generated update statements are correct.
 Once erasure is in progress, the status in RuleCustomers will be set to Erasure In Progress.

Consent Management
Customer consent preferences are managed in core banking. Analytics imports the metadata relating to
consent and makes it available to Analytics users both in raw format, as a lookup table in InsightSource
and as flags in InsightWarehouse..DimCustomer.

InsightImport is configured to import the new CDP consent tables. Two RulesEngine rules are added to
create a Consent lookup table in InsightSource, and then to add data from that lookup table to
InsightWarehouse..DimCustomer.

Consent data is imported via DW Export from Core Banking. These consent products are rolled up to the
customer level in Data Manager as a Has Consent flag and exposed in InsightWarehouse.

Rights Management
 For the following Rights, automatic or manual Rights processing will be triggered once the
request is approved and authorised.
o Right to Erasure
o Right to Restrict processing


 Customer activity will be stored in RuleCustomers. When customers have invoked rights in Core,
they will be entered in this table and a bridge table RuleCustomerRuleColumn will store the
personal data definition for this customer and the rights status for each column.

Considerations/Dependencies
The system is dependent on Customer Data Protection metadata originating from Temenos Core Banking.


Right to Erasure Detailed Design


The design is based on the following principles. Metadata is converted from either Temenos Core Banking
or Analytics sources and stored in the Analytics Rules Engine Data Model in a specific format that allows
code to be generated which will result in the Right to Erasure being fulfilled.
Data Model
The tables used to manage erasures are the following:

 RuleDefinitions
 RuleColumns
 RuleCustomers
 RuleCustomersRuleColumns
 RuleReplacements
 CDPPurpose

Metadata
Metadata describing invoked rights to erasure is stored in the above data model.

Temenos Core Banking


The following tables from Temenos Core Banking will be used to act on the rights of Erasure.

T24 file                Description
CDP_DATA_ERASED_TODAY   Reports the columns erased in Core Banking per customer
                        on a particular day.
CUSTOMER.ACTIVITY       Contains customer right to Erasure requests. Used for
                        Erasures in InsightWarehouse.

Analytics Metadata
Metadata will be used to create Analytics rules which can then be used to generate logic to erase customer
data.

Analytics lookup table   Description
RuleReplacements         Contains possible replacement values for Erasures.
CDPPurpose               Contains retention periods for data corresponding to a purpose.
Datadictionary           New columns in the InsightWarehouse DataDictionary allow
                         InsightWarehouse columns to be assigned Replacement Values
                         and purpose codes.


Metadata Mapping into the Analytics CDP Data Model


InsightLanding Erasures
Analytics Table            T24 Source file          Description
RuleDefinitions            CDP_DATA_ERASED_TODAY    Contains the sensitive customer table.
RuleColumns                CDP_DATA_ERASED_TODAY    A group of sensitive columns in the table above.
RuleCustomers              CDP_DATA_ERASED_TODAY,   Contains customer right to Erasure requests.
                           CUSTOMER_ACTIVITY
RuleCustomersRuleColumns   CDP_DATA_ERASED_TODAY,   An intersection table between RuleColumns and
                           CUSTOMER_ACTIVITY        RuleCustomers. Contains the RuleCustomer id, the
                                                    RuleColumnId and the Erasure Date when the data
                                                    from a particular column should be erased.

InsightWarehouse Erasures
Analytics Table                    Source Data                  Description
RuleDefinitions                    Datadictionary               Contains the sensitive customer table.
RuleColumns                        Datadictionary               A group of sensitive columns in the table above.
RuleReplacements                   Lookup table with default    Contains the list of erase options that can be
                                   values.                      applied to each field based on the data type.
CDPPurpose                         Lookup table with default    Contains the retention period per purpose code.
                                   values.
RuleCustomers                      CUSTOMER.ACTIVITY            Contains customer right to Erasure requests.
RuleCustomersRuleColumns           CUSTOMER.ACTIVITY            An intersection table between RuleColumns and
                                                                RuleCustomers. Contains the RuleCustomer id, the
                                                                RuleColumnId and the ActionDate when the data
                                                                from a particular column should be erased. The
                                                                ActionDate is calculated based on the purpose
                                                                code associated with a particular column and is
                                                                customer specific, since each purpose code has a
                                                                set number of retention days counted from the
                                                                date the customer became inactive, which is
                                                                stored in the Customer.Activity table.
InsightWarehouse..Datadictionary                                Datadictionary columns will be manually mapped
                                                                to source data from T24 held in the CDP Data
                                                                Definition table. For example, Customer.Name_1
                                                                from T24 is First Name in the data warehouse.
                                                                This cannot be done automatically based on T24
                                                                metadata, since there are many transformations
                                                                that the data can go through.

New columns will be added:

CDPColumn: 1/0
CDPPurpose: Legal, Marketing, Consent etc.
CDPEraseAction: Nullify, Default

Metadata Mapping into Analytics Rule Engine


The mapping described above is accomplished by means of views and stored procedures.

Views

Views                                   Description
v_ConvertCDPRules                       Maps T24 CDP_DATA_ERASED_TODAY to RuleDefinitions
                                        and RuleColumns column names.
v_ConvertCDPRuleCustomers               Maps T24 CDP_DATA_ERASED_TODAY and
                                        CUSTOMER_ACTIVITY to RuleCustomers column names.
v_ConvertCDPRuleCustomersRuleColumns    Maps T24 CDP_DATA_ERASED_TODAY to
                                        RuleCustomersRuleColumns column names.
v_ConvertCDPAnalyticsRulesDimensions,   Map InsightWarehouse..Datadictionary CDP columns to
v_ConvertCDPAnalyticsRulesFacts         RuleDefinitions and RuleColumns column names.

Mapping Details

RuleDefinitions


ColumnName              Description                                        Example Values
DatabaseName                                                               InsightLanding, InsightWarehouse
SchemaName              The schema of the table. 'All' will expand to     All, 20091231BS
                        all available schemas for a given table name.
TableName               The table with columns to be "Erased".             Customer
ExecutionPhase          Set to Extract.
ExecutionStep           Set to 1.
IsPersisted             Set to 1.
ItemName                Set to CDP Item.
Operation               Set to GDPR.
IsTemporal              Whether historical changes are preserved. For      0
                        GDPR, set to 0; changes will be made to records
                        in place.
RuleDefinitionID        The surrogate key.
SourceRuleDefinitionId  The natural key used to identify a record.
RuleDefinitionRowHash   The key used to identify whether any column        DBO.FN_HASHMAXUNICODESHA1CLR(UPPER(CONVERT(
                        added to this hash has changed.                    NVARCHAR(MAX),(SELECT [TABLENAME],
                                                                           [EXECUTIONPHASE],[EXECUTIONSTEP],
                                                                           [DATABASENAME],[SCHEMANAME],[ITEMNAME],
                                                                           [OPERATION],[ISTEMPORAL] FOR XML RAW))))
                                                                           AS RULEDEFINITIONROWHASH

RuleColumns

ColumnName       Description                                       Example Values
ColumnType       Various column types for different purposes.      Update – the column that is actually erased/nullified.
                                                                   BridgingColumn – column used to join to bridge tables.
                                                                   WhereColumn – column used for the final where clause
                                                                   of the update statement.
ColumnContents   The column name. For ColumnType = Update,         For ColumnType = Update: AccountName.
                 the column being updated.                         For ColumnType = BridgingColumn: AccountId.
                                                                   For ColumnType = WhereColumn: CustomerNum.
ColumnOrder      The order of the column. Needs to be sequential   The column with ColumnType = Update: 1.
                 when bridging columns are used, so that the       The first bridging table, ColumnType = BridgingColumn: 2.
                 joins can be ordered correctly.                   The second bridging table, ColumnType = BridgingColumn: 3.
                                                                   The WhereColumn, ColumnType = WhereColumn: 4.
JoinColumnName   Used where ColumnType = BridgingColumn. The       CustomerId
                 other side of the join condition.
JoinTableName    The name of the table to join to. Used when       In the case of updating a column in DimAccount, we
                 ColumnType = BridgingColumn. Bridging columns     would have two records with JoinTableName used:
                 are necessary when, for example, we have to       FactAccount and DimCustomer.
                 erase a column in DimAccount for a certain
                 customer: DimAccount does not have a
                 CustomerNum column to filter the update, so we
                 need to link through other tables in order to
                 reach a table with a CustomerNum column.

The resulting (simplified) query would be:

Update DimAccount
Set AccountName = 'xxxxx_1'
From DimAccount
Inner Join FactAccount
   On FactAccount.AccountId = DimAccount.AccountId
Inner Join DimCustomer
   On FactAccount.CustomerId = DimCustomer.CustomerId   -- join via the customer key
Where DimCustomer.CustomerNum = 1234

RuleReplacements

ColumnName    Description                                           Example Values
TableType     The type of table; used because Dimensions need to    Default
              be treated differently.                               Dimension
EraseOption   The name of the record.                               DW_Date_Default
BaseDataType  The base data type.                                   Date, Alpha, Number
EraseAction   What to replace the erased value with. Dimensions     Nullify
              cannot be nullified. This links to a similar value    Default
              in InsightWarehouse..DataDictionary.
EraseValue    The actual value to use as a replacement.             Nullvalue – will be replaced with an actual NULL
                                                                    9999-12-31
                                                                    A
                                                                    0

CDPPurpose

ColumnName       Description                                Example Values
CDPPurposeId                                                LEGITIMATE, MARKETING, LEGAL,
                                                            CONTRACT, TAX, CONSENT
Purpose
RetentionPeriod  The period before records can be erased.   8D, 6M

RuleCustomers

ColumnName              Description                                       Example Values
CustNo                  The customer number.
CustomerActivityStatus  The status of the customer. Only customers        INACTIVE
                        which are inactive can have columns erased.       ERASURE.IN.PROGRESS
InactiveSinceDate       The date when the customer became inactive.
ErasureDate             When the columns for the customer were erased.

RuleCustomersRuleColumns

ColumnName              Description                                         Example Values
ActionDate              The date when Erasure can be performed.
RetentionPeriod         Used for erasures in InsightWarehouse. The          6M
                        period that the customer data must be retained
                        before it can be erased. Used to calculate the
                        ActionDate.
ColumnReplacementValue  The value that will be used in place of the        XXXXXXXX
                        value to be erased.                                 For Dimension updates: XXXX_1, XXXX_2 etc.,
                                                                            since record uniqueness needs to be maintained.
EraseDataType           Not required. Obtained from sys.columns.
TableFilterValue        The value of the @Id (RecordId) filter. Used       164096.1
                        for the WhereColumn filter for data from
                        InsightLanding only.


Stored Procedures

Stored Procedures                  Description
s_LoadCDPRules                     Loads CDP.Data.Definition data into RuleDefinitions and
                                   RuleColumns by calling the view v_ConvertCDPRules.
s_LoadCDPRuleCustomers             Loads v_ConvertCDPRuleCustomers into RuleCustomers.
s_LoadCDPRuleCustomersRuleColumns  Loads T24 customer.activity, CDP.Purpose and
                                   CDP.Data.Definition into RuleCustomersRuleColumns column
                                   names by calling the view
                                   v_ConvertCDPRuleCustomersRuleColumns.
s_LoadCDPAnalyticsRules            Loads all metadata tables with Analytics related metadata.
                                   The following views/tables are called:
                                   [tmp].[v_ConvertCDPAnalyticsRulesDimensions]
                                   [tmp].[v_ConvertCDPAnalyticsRulesFacts]
                                   InsightImport..Customer_Activity
                                   The stored procedure maps the data into the following
                                   tables in turn: RuleDefinitions, RuleColumns,
                                   RuleCustomersRuleColumns.

The stored procedures above call the stored procedure s_LoadTemporalData which does the actual
inserting/updating into the target metadata tables. Loads can be Temporal or not. Temporal means that
all changes will be preserved by business date.

Analytics Right To Erasure Logic


SQL update queries are generated based on the metadata detailed above.

An update statement is generated for each column and for each customer. Updates will be generated based on the ActionDate for a particular customer's column. Only columns with action dates in the past will be updated.

Stored Procedures       Description                                   Parameters
s_CreateCDPUpdateLogic  Returns an Update statement that erases       @ColumnList – a table containing a list of columns.
                        configured columns for a certain customer.    @SchemaName – the schema name, either 'All' or
                        Inserts results into RuleResultsLog.          the actual schema.

Update statements can be simple if the table to be updated contains a customer number. If not, bridge tables need to be added to the statement in order to link to a table that contains the customer number, so that the query can be filtered by the appropriate customer.
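For illustration, a simple (non-bridged) generated statement might look like the following sketch. The table, column and replacement value here are hypothetical; real statements are generated from the RuleColumns metadata.

-- Hypothetical generated statement for a table that holds CustomerNum
-- directly; no bridge joins are required in this case.
Update DimCustomer
Set FirstName = 'XXXX_1'      -- replacement value from RuleReplacements
Where CustomerNum = 1234      -- customer who invoked the right to erasure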

Executing Rules to Erase Sensitive Customer Columns


Once all metadata has been converted into the Analytics Rules Engine data model the Rules need to be
executed in order to do the actual “Erasure”.

The user flow is as follows:

 Automatically generate code to "Erase" data based on the stored metadata.
 Validate the generated code programmatically by testing the code and storing the number of records that would be updated.
 An administrative user reviews the generated code and approves each statement.
 The user triggers execution of the approved code.

Stored Procedures            Description                                        Parameters
s_ExecuteCDPRules            This stored procedure results in the population    @BusinessDate: the business date.
                             of RuleResultsLog with code to erase sensitive     @CustNo: optional customer number; defaults
                             customer data, then validates and runs the         to 'All' customers, based on the retention
                             generated code. An administrator should            dates.
                             manually approve all code, since the               @ExecuteMode: Validate or Execute.
                             consequences of an error would be costly.          @DatabaseName
                                                                                @SchemaName
                             The procedure would first be run as:               @TableName
                             Exec s_ExecuteCDPRules '2018-05-12', 'All',
                             'Validate'
                             This will mark each generated statement for
                             each customer as IsValidated = 1, with a
                             validation record count which is the number of
                             affected records.

                             At this point the record needs to be manually
                             updated to IsApproved = 1.
                             Then the procedure would be run as:
                             Exec s_ExecuteCDPRules '2018-05-12', 'All',
                             'Execute'
                             This will run and commit the update which
                             erases the customer data. Customer metadata
                             will be updated to reflect that the data has
                             been erased.

                             Calls s_CreateCDPUpdateStatements.

s_CreateCDPUpdateStatements  Called by s_ExecuteCDPRules above; generates       @DatabaseName
                             the column list based on CDP metadata in           @SchemaName
                             RuleDefinitions and related tables, then calls     @TableName
                             s_CreateCDPUpdateLogic, which inserts the          @BusinessDate
                             generated update statements into
                             RuleResultsLog.

                             Statements are created for customers with
                             ActionDates earlier than the provided
                             BusinessDate.

The final results of the above process are stored in RuleResultsLog.

RuleResultsLog

Column                 Description
SQLStatement           The generated SQL statement that will "Erase" customer data.
RuleDefinitionID       The associated rule for the columns to be erased.
CustNo                 The customer number for which data will be erased.
LeadCoMne              The company of the customer.
IsApproved             Has a generated statement been approved? (Manual process.) This
                       column needs to be manually updated after approval is verified.
ValidationRecordCount  The number of records to be updated. This is updated on validation.
IsValidated            0 or 1. Set when s_ExecuteCDPRules is run in validation mode.
HasExecuted            0 or 1. Has the statement been run and the data erased?
ActionDate             The date that the column can be erased, based on retention policy.
ExecutedDate           The date when the statement was executed and committed.

Agent Job
An agent job will run the following daily:

Stored Procedure                   Description
s_LoadCDPRules                     Update CDP rule metadata.
s_LoadCDPRuleCustomersRuleColumns
s_LoadCDPAnalyticsRules
s_CreateCDPUpdateStatements        Run in Validate mode.
s_CreateCDPUpdateStatements        Run in Execute mode; will run any statements that have
                                   been approved.


Interface and Usage


The agent job would run the series of stored procedures daily in order to generate a list of system validated
update statements for each customer that has invoked the right to erasure.

At this point user approval is required, due to the large impact of running a bad statement. Currently, approval is done by reviewing the RuleResultsLog table and manually marking the IsApproved column as True, as sketched below. Once IsApproved and IsValidated are both true, the CDP Erasure update statement will be executed.
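A sketch of how the manual approval might be recorded; the filter columns are taken from the RuleResultsLog description above, but the exact statement is an assumption.

-- Hypothetical manual approval of a validated erasure statement.
Update InsightETL.dbo.RuleResultsLog
Set IsApproved = 1
Where CustNo = 1234          -- the customer whose statement was reviewed
  And IsValidated = 1        -- only approve statements that passed validation
  And HasExecuted = 0        -- and that have not yet been run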


Configuration
The metadata to erase customer data comes from T24 via DW Export and from columns in InsightWarehouse..Datadictionary. A number of changes are required to existing Analytics installations in order to produce the required data.

See TFS IM Framework / Configuration for the latest configuration values. All configuration data is in XML
files.

InsightImport
Add the following records to InsightImport in order to import CDP tables from T24.

TableName: CZ_CDP_DATA_DEFINITION
SchemaName: dbo | Enabled_: 1 | T24RebuildFromSelectionIfMissing: Yes
T24SelectionName: CZ_CDP_DATA_DEFINITION
T24Multi-valueAssociation: SYS_FIELD_NAME:SYS_FIELD_ATTRIBUTES:SYS_PURPOSE:SYS_ERASE_OPTION:SYS_ACCESSIBILITY:SYS_EXCLUDE
T24Sub-valueAssociation: CZ_CDP_DATA_DEFINITION_SYS_FIELD_NAME|SYS_PURPOSE
Configuration: ModelBank_CDP

TableName: CZ_CDP_ERASE_OPTION
SchemaName: dbo | Enabled_: 1 | T24RebuildFromSelectionIfMissing: Yes
T24SelectionName: CZ_CDP_ERASE_OPTION
T24Multi-valueAssociation: NULL
T24Sub-valueAssociation: NULL
Configuration: ModelBank_CDP

TableName: CZ_CDP_PURPOSE
SchemaName: dbo | Enabled_: 1 | T24RebuildFromSelectionIfMissing: Yes
T24SelectionName: CZ_CDP_PURPOSE
T24Multi-valueAssociation: NULL
T24Sub-valueAssociation: NULL
Configuration: ModelBank_CDP

TableName: CZ_CUSTOMER_ACTIVITY
SchemaName: dbo | Enabled_: 1 | T24RebuildFromSelectionIfMissing: Yes
T24SelectionName: CZ_CUSTOMER_ACTIVITY
T24Multi-valueAssociation: PURPOSE:ERASURE_DATE:ERASURE_STATUS,ACTIVE_CONTRACT_ID:ACTIVE_CONTRACT_APPLN:ACTIVE_CONTRACT_CO_CODE:ACTIVE_CONTRACT_LINK:CONTRACT_CREATION_DATE,COMPLETED_CONTRACT_ID:COMPLETED_CONTRACT_APPLN:COMPLETED_CONTRACT_LINK:COMPLETED_CONTRACT_CO_CODE:CONTRACT_END_DATE,OTHER_LINKED_APPLN:OTHER_LINKED_RECORD:OTHER_LINKED_CO_CODE:OTHER_LINKED_REC_STATUS
T24Sub-valueAssociation: NULL
Configuration: ModelBank_CDP

TableName: CZ_CDP_DATA_ERASED_TODAY
SchemaName: dbo | Enabled_: 1 | T24RebuildFromSelectionIfMissing: Yes
T24SelectionName: CZ.CDP.DATA.ERASED.TODAY
T24Multi-valueAssociation: RECORD_ID:COMPANY_ID:FIELD_NAME:PURPOSE:ERASE_OPTION:NEW_FIELD_VALUE
T24Sub-valueAssociation: CZ_CDP_DATA_ERASED_TODAY_RECORD_ID|FIELD_NAME:PURPOSE:ERASE_OPTION:NEW_FIELD_VALUE
Configuration: ModelBank_CDP

Run InsightImport
Exec [Insight].[s_Import_Control] @PathName = 'C:\Datafoler', @TableName = 'All',
@ReCreateTables = 1, @TableType = 'Regular', @SystemTablesExist = 0, @BatchNum = null,
@TotalThreads = null;

Make sure the new tables are added to InsightImport:

SELECT *
FROM [InsightImport].[Insight].[Entities]
Where Name like 'Cz%'

Exec [Insight].[s_Import_Control] @PathName = 'C:\Datafoler', @TableName = 'All',
@ReCreateTables = 0, @TableType = 'localref', @SystemTablesExist = 1, @BatchNum = null,
@TotalThreads = null;

Exec [Insight].[s_Import_Control] @PathName = 'C:\Datafoler', @TableName = 'All',
@ReCreateTables = 0, @TableType = 'MV', @SystemTablesExist = 1, @BatchNum = null,
@TotalThreads = null;

Exec [Insight].[s_Import_Control] @PathName = 'C:\Datafoler', @TableName = 'All',
@ReCreateTables = 0, @TableType = 'MVSV', @SystemTablesExist = 1, @BatchNum = null,
@TotalThreads = null;

Publish Latest Database Insight

For older Analytics installations, the latest version of the Insight database must be published.

Publish Database InsightETL

For older Analytics instances that do not contain InsightETL, InsightETL should be published. For new instances, the additional objects need to be published.

Lookup Tables
Configure the following lookup tables.

InsightETL..RuleReplacements

Default values should be populated when InsightETL is published. At a minimum the following values should
be present. This data is used for erasure of InsightWarehouse data only, not InsightLanding. The data is
used to determine what to replace erased data with.

TableType EraseOption BaseDataType EraseAction EraseValue


NULL DATE DATE DEFAULT NULL
Default Date_Nullify Date Nullify Nullvalue
Default Alpha_Nullify Alpha Nullify Nullvalue
Default Number_Nullify Number Nullify Nullvalue
Default Date_Default Date Default 12/31/9999
Default Alpha_Default Alpha Default A
Default Number_Default Number Default 0
Dimension DW_Date_Default Date Default 12/31/9999
Dimension DW_Alpha_Default Alpha Default A
Dimension DW_Number_Default Number Default 0

InsightETL..CDPPurpose

Default values should be populated when InsightETL is published. At a minimum the following values should
be present. This data is used for erasure of InsightWarehouse data only, not InsightLanding. The data is
used to determine when InsightWarehouse data is erased.

CDPPurposeID  Purpose     RetentionPeriod
LEGITIMATE    LEGITIMATE  4D
MARKETING     CONSENT     7D
LEGAL         LEGAL       8D
CONTRACT      CONTRACT    8D
TAX           LEGAL       8D
CONSENT       CONSENT     8D

InsightWarehouse..DataDictionary

Three columns should be added to InsightWarehouse..Datadictionary if CDP functionality is being added to


an existing Analytics instance.

ColumnName      Description                                         Values
CDPColumn       Is the column a CDP column that can be erased       1, 0
                for a customer?
CDPPurpose      The purpose of the column; used to link to the      LEGITIMATE, CONSENT,
                lookup table InsightETL.CDPPurpose.                  LEGAL, CONTRACT
CDPEraseAction  The RuleReplacement type to replace the erased      Nullify, Default
                value with. Links to RuleReplacements.
                Dimensions cannot be nullified, since Dimension
                record uniqueness needs to be maintained.

Alter Table InsightWarehouse.[dbo].[DataDictionary]
Add [CDPColumn] int Null,
    [CDPPurpose] nvarchar(50) Null,
    [CDPEraseAction] nvarchar(50) Null

Review the CDP_DATA_DEFINITION columns (customer sensitive) and mark the corresponding columns in InsightWarehouse..Datadictionary by setting CDPColumn = 1 and assigning a CDPPurpose and a CDPEraseAction.

For example, an erasable column in InsightImport..CDP_DATA_DEFINITION, CUSTOMER.NAME.1, might correspond to InsightWarehouse..DimCustomer.FirstName. In this case DimCustomer.FirstName would be assigned CDPColumn = 1, as well as the appropriate CDPPurpose and CDPEraseAction, as sketched below.
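A sketch of this marking, assuming the DataDictionary rows are keyed by table and column name; the key column names and the chosen purpose are assumptions.

-- Hypothetical marking of DimCustomer.FirstName as an erasable CDP column.
Update InsightWarehouse.[dbo].[DataDictionary]
Set CDPColumn      = 1,
    CDPPurpose     = 'CONSENT',   -- links to InsightETL..CDPPurpose
    CDPEraseAction = 'Default'    -- links to InsightETL..RuleReplacements
Where TableName  = 'DimCustomer'  -- assumed key columns
  And ColumnName = 'FirstName'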

Populate CDP Data Model


The tables RuleDefinitions, RuleColumns, RuleCustomers, RuleCustomersRuleColumns need to be
populated to test functionality.

The following stored procedures will populate the data model. RuleCustomers and RuleCustomersRuleColumns will only be populated if there are records in CZ_CDP_DATA_ERASED_TODAY for InsightLanding erasures, or if the following returns data for InsightWarehouse erasures:

Select * From InsightImport..CZ_CUSTOMER_ACTIVITY
Where INACTIVE_SINCE_DATE is not null

Exec [dbo].[s_LoadCDPRules]
Exec [dbo].[s_LoadCDPRuleCustomers]
Exec [dbo].[s_LoadCDPRuleCustomersRuleColumns]
Exec [dbo].[s_LoadCDPAnalyticsRules]


Validate Data to Test


Assuming all the above has been configured and there are Customers with Erasure requests the following
can be run.
Exec InsightETL.[dbo].[s_ExecuteCDPRules]
@BusinessDate = '2018-12-24',
@CustNo = 'all',
@ExecuteMode = 'validate',
@DatabaseName = 'insightwarehouse',
@SchemaName = 'all',
@TableName = 'dimcustomer'

This will execute the defined erasures in Validate mode and will populate the InsightETL..RuleResultsLog table with erasure statements:

Select * From InsightETL.dbo.RuleResultsLog
Where TableName = 'DimCustomer'

There should be a record per customer per table.

If the validation is successful, RuleResultsLog.IsValidated will be set to 1 and, if records exist for the customer, ValidationRecordCount will have a positive, non-zero value.

Final Execution
Once data has been validated the update statements can be run as follows:

Exec InsightETL.[dbo].[s_ExecuteCDPRules]
@BusinessDate = '2018-12-24',
@CustNo = 'all',
@ExecuteMode = 'Execute',
@DatabaseName = 'insightwarehouse',
@SchemaName = 'all',
@TableName = 'dimcustomer' -- or 'all' for all tables


Consent Management
Datamodel

Database       TableName               Description
InsightETL     RuleDefinitions         Contains two rules for Consent:
                                       1. A rule to create the CustomerConsentList
                                          lookup table (InsightSource).
                                       2. A rule to add columns from a rollup of
                                          CustomerConsentList to DimCustomer.
InsightETL     RuleColumns             Contains columns for the tables above.
InsightSource  CDPCustomerConsentList  Contains a list of the current consents provided
                                       by customers.

Table Details
This table will be populated by the RulesEngine using a Dataset rule and a SourceSQLStatement. The table
will be re-created every ETL load based on the contents of AA_Arrangement.

InsightSource.BS.CDPCustomerConsentList

Column Description
CustNo The customer Number
CompanyNo The customer company
LeadCoMne The company mnemonic of the customer
Product The consent product
StartDate The date consent was given.
EndDate The date consent was revoked.

InsightWarehouse.dbo.DimCustomer

Consent flags will be added to DimCustomer via the Datadictionary.

Column              DataType  Description
CustomerNum                   The customer number; existing column.
RestrictProcessing  Int       New column; restrict processing code, 1/0.
RestrictMarketing   Int       New column; restrict marketing code, 1/0.

Metadata
Temenos Core Banking
The following tables from Temenos Core Banking are used for Consent Management and will be used to
create the CDPCustomerConsentList lookup table.

T24 file             Description
CK_CONSENT_TYPE      Contains consent types, such as: Personal Loan Marketing,
                     Credit Decision Making, Direct Marketing, Data Profiling
                     and Analytics.
BNK_AA_ARRANGEMENT   Contains the various consents that a customer has provided.
CK_CDP_CONSENT_XREF  Contains the consent ArrangementID and Customer Number.

Rule Definitions
Two RuleDefinitions will be defined in the Rules Engine: one to create the required CDPCustomerConsentList table and one to add the required columns to DimCustomer.

Dataset rules with SQL statements will be used. The first rule, which creates the CDPCustomerConsentList lookup table, will have a SQL statement selecting from AA_Arrangement and the other CDP consent tables. The second rule will be a rollup of the lookup table, in order to add consent flags at the customer level to InsightWarehouse..DimCustomer, as sketched below.
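As an illustration of the rollup idea, a query along these lines could derive a single consent flag per customer from the lookup table. This is a sketch only: the consent type value is illustrative rather than an actual T24 code, the column names follow the Simple SourceSQLStatement shown later in this section, and the real rule is defined as Rules Engine metadata rather than a standalone query.

-- Hypothetical rollup: one row per customer with a derived consent flag.
SELECT CustomerNum,
       MAX(CASE WHEN ConsentType  = 'DIRECT MARKETING'   -- illustrative code
                 AND ConsentGiven = 'YES'
                THEN 1 ELSE 0 END) AS HasMarketingConsent
FROM InsightSource.BS.CDPCustomerConsentList
GROUP BY CustomerNum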

Configuration
The metadata to manage customer consent comes from T24 via DW Export. A number of changes are
required to existing Analytics installations in order to produce the required data.

InsightImport
Add the following records to InsightImport in order to import CDP tables from T24.


TableName            SchemaName  Enabled_  T24RebuildFromSelectionIfMissing  T24SelectionName     Configuration
CK_CDP_CONSENT_XREF  dbo         1         Yes                               CK.CDP.CONSENT.XREF  ModelBank_CDP
CK_CONSENT_TYPE      dbo         1         Yes                               CK.CONSENT.TYPE      ModelBank_CDP

(The T24 Multi-value and Sub-value Association columns are empty for these tables.)

InsightLanding
The following records are added to InsightLanding in order to make the tables flow from InsightImport.

ExtractListId  SourceName  SourceDB       SourceSchema  SourceTable          TargetTable  ImportFlag  ImportFields  ImportOrder  UserId  Configuration
1              Dbo         InsightImport  Dbo           CK_CONSENT_TYPE      NULL         1           *             1            Dbo     ModelBank_CDP
2              Dbo         InsightImport  Dbo           CK_CDP_CONSENT_XREF  NULL         1           *             2            Dbo     ModelBank_CDP

InsightWarehouse Datadictionary
New consent flags (Yes/No) are added to the datadictionary. For example:

Simple:
RestrictProcessing
RestrictMarketing

Detailed:
BiometricConsent
EmailConsent
GeolocationConsent
LetterConsent
PhoneConsent
SMSConsent


RulesEngine Configuration
Two new rules are configured as follows.

InsightSource.BS.CDPCustomerConsentList lookup table


This RuleDefinition is a Dataset rule which utilises a SourceSQLStatement to create the table.

RuleDefinitions

Column              Description                                         Value
TenantId
RuleGroupId
DatabaseName        The database name of the table to be created.
SchemaName
TableName           The table name that will be created in this case.  CDPCustomerConsentList
ExecutionPhase      The ETL phase.                                      Configuration
ExecutionStep       An additional filter, as above.                     1
IsPersisted         1 – add a physical column to a base table.          1
                    0 – create a view around a base table with the
                    column(s) added to the view.
Description                                                             Creates CDPCustomerConsentList Table
RuleDefinitionId    Unique ID of the rule definition.
LegacyItemId        The ID of the rule in legacy processes.
ItemName                                                                CDPCustomerConsentList
Operation           One of Lookup, Banding, Calculation, Split,         Set to 'Dataset'
                    MaxColumn, Dataset, DatasetTable.
SourceSQLStatement  Used for Operation = Dataset. The Simple and
                    Detailed statements are shown below.

Simple:
SELECT DISTINCT A.lead_co_mne   AS LeadCoMne,
       A.customer               AS CustomerNum,
       A.[@id]                  AS ArrangementId,
       product_line             AS ProductLine,
       A.product_group          AS ProductGroup,
       A.product                AS Product,
       A.arr_status             AS ConsentStatus,
       A.product                AS ProductName,
       A.start_date             AS StartDate,
       ACT.consent_type         AS ConsentType,
       ACT.consent_given        AS ConsentGiven
FROM insightsource.bs.aa_arrangement A
     INNER JOIN (SELECT ac1.lead_co_mne,
                        ac1.id_comp_1,
                        ac1.[@id]
                 FROM insightsource.[BS].aa_arr_cdp_consent ac1
                      INNER JOIN (SELECT lead_co_mne,
                                         id_comp_1,
                                         Max(id_comp_3) AS ID_Comp_3
                                  FROM insightsource.[BS].aa_arr_cdp_consent ac1
                                  GROUP BY lead_co_mne,
                                           id_comp_1) dtac
                              ON ac1.id_comp_1 = dtac.id_comp_1
                             AND ac1.id_comp_3 = dtac.id_comp_3) AC
             ON A.[@id] = AC.id_comp_1
            AND A.lead_co_mne = AC.lead_co_mne
     INNER JOIN insightsource.[BS].[aa_arr_cdp_consent_consent_type] ACT
             ON AC.[@id] = ACT.[@id]

Detailed:

SELECT DISTINCT a.lead_co_mne         AS leadcomne,
       a.customer                     AS customernum,
       a.[@id]                        AS arrangementid,
       a.product_line                 AS productline,
       a.product_group                AS productgroup,
       a.product                      AS product,
       a.arr_status                   AS consentstatus,
       a.product                      AS productname,
       a.start_date                   AS startdate,
       act.consent_type               AS consenttype,
       act.consent_given              AS consentgiven,
       actst.consent_sub_type         AS consentsubtype,
       actst.sub_type_consent_given   AS consentgivensubtype
FROM insightsource.bs.aa_arrangement A
     INNER JOIN (SELECT ac1.lead_co_mne,
                        ac1.id_comp_1,
                        ac1.[@id]
                 FROM insightsource.[BS].aa_arr_cdp_consent ac1
                      INNER JOIN (SELECT lead_co_mne,
                                         id_comp_1,
                                         Max(id_comp_3) AS id_comp_3
                                  FROM insightsource.[BS].aa_arr_cdp_consent ac1
                                  GROUP BY lead_co_mne,
                                           id_comp_1) dtac
                              ON ac1.id_comp_1 = dtac.id_comp_1
                             AND ac1.id_comp_3 = dtac.id_comp_3) AC
             ON a.[@id] = ac.id_comp_1
            AND a.lead_co_mne = ac.lead_co_mne
     INNER JOIN insightsource.[BS].[aa_arr_cdp_consent_consent_type] ACT
             ON ac.[@id] = act.[@id]
     INNER JOIN insightsource.[bs].aa_arr_cdp_consent_consent_type_consent_sub_type ACTST
             ON act.[@id] = actst.[@id]
            AND act.sequence = actst.mvsequence

Platform Configuration and Customization


This chapter illustrates a number of commonly used configuration and customization tasks within the Advanced Analytics Platform.

Importing a New Table with SQL Script


This section demonstrates how to import the content of a Core Banking table using the tools of MS SQL
Server Management Studio. However, this section does not cover the use of DW.EXPORT and the extraction
of CSV files from Core Banking. Please refer to DW.EXPORT documentation for more information.

Creating the CSV file and copying it to the database server

The first step in importing a new Core Banking table into any Analytics Platform is to ensure that a CSV file is extracted for the Core Banking table to be imported into the database layer. In addition, users need to ensure that this file is copied to the import folder referenced by the Process Data ExStore/Analytics ETL jobs.

Duplicate-checking
The second step of the process requires users to check the SourceSchema table in the InsightImport
database. This table contains a list of all the DW.EXPORT CSV files extracted from Temenos Core Banking
that should be loaded into InsightImport. Users can modify the SourceSchema table manually in SSMS,
through a SQL/T-SQL script, or with the aid of a version control tool like MS Team Foundation Server,
however this example relies on MS SSMS.

Users can run a query on SourceSchema to ensure that a definition for the Core Banking table they want to import does not exist already and, hence, to avoid duplicates. An example of a query that searches SourceSchema for a definition of a Core Banking table called 'COLLATERAL_RATING' is presented in Figure 38. If the query executes successfully but returns no result, as shown below, users can proceed to the next step.

Figure 38 - Sample of MS SQL Query used to check for duplicates
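A sketch of such a duplicate check follows; the exact schema of the SourceSchema table may differ per installation, so the two-part name and filter column are assumptions.

-- Hypothetical duplicate check before adding a new SourceSchema entry.
SELECT *
FROM InsightImport.dbo.SourceSchema
Where TableName = 'COLLATERAL_RATING'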

Otherwise, if a SourceSchema definition exists but the table does not get properly imported into the database layer during Analytics ETL/Process Data ExStore, users will have to troubleshoot and investigate why the upload fails. This process, however, is beyond the scope of this section.

A similar query should also be run on the ExtractList table of the InsightLanding database. As highlighted earlier, this table contains the list of tables to be imported and archived in InsightLanding from InsightImport and, potentially, also from other third-party source systems.


Updating SourceSchema in the InsightImport Database

The third step is to write into SourceSchema a new entry defining the Core Banking table that needs to be imported.

Figure 39 shows an example of a SQL script that will insert into the SourceSchema table a new row for a table called COLLATERAL_RATING. Please note that, if users want the table to be uploaded to InsightImport, the value of the Enabled_ column must be set to '1'. Furthermore, they will have to set the value of the Configuration column to 'Local', as this new entry will belong to the Local Configuration layer of the platform.

Figure 39 – Sample of insert statement to update SourceSchema with new table's definition
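A sketch of such an insert statement; only the columns named in this guide are shown, and the full column set of SourceSchema is an assumption.

-- Hypothetical SourceSchema entry for the new table (no parsing rules,
-- no online extraction configured).
INSERT INTO InsightImport.dbo.SourceSchema (TableName, Enabled_, Configuration)
VALUES ('COLLATERAL_RATING', 1, 'Local')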

An important comment to make is that, in the example above, the SourceSchema columns used to define
parsing rules are not configured. If the imported Core Banking table included local reference fields, multi-
values or sub-values, we would need to include values for parsing-related columns in the INSERT INTO
script. Similarly, if we wanted to configure the table for Online extraction, the INSERT INTO statement
should also be configured to set the value of the OnlineProcess column to 1 for the current table.

Once the INSERT INTO query has successfully run, it is good practice to execute a SELECT query on the
SourceSchema table to ensure that we have entered the details of our Local configuration correctly, as
shown in Figure 40.

Figure 40 – SQL Query to check the locally developed content of SourceSchema

Updating ExtractList in the InsightLanding Database


Once the definition of the new table has been included in SourceSchema, users should also insert an
equivalent new entry in the ExtractList table of the InsightLanding database. As SourceSchema defines
which Core Banking tables should be loaded into InsightImport, ExtractList configures the tables to be
stored and archived in InsightLanding when Process Data ExStore or Analytics ETL run.

Figure 41 illustrates a sample INSERT INTO query to add a definition for the COLLATERAL_RATING table to ExtractList. Please note that, in order to ensure that the table defined in ExtractList is actually stored in InsightLanding during Process Data ExStore/Analytics ETL, users need to set the ImportFlag column to '1'. In addition to this, the value of the Configuration column must be set to 'Local' for any new entry added to the Local configuration layer of a financial institution. Another important note is that the statement shown below does not include any configuration for online processing.

Figure 41 – Sample of insert statement to update ExtractList with new table's definition
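A sketch of the equivalent ExtractList entry; as before, only the columns named in this guide are shown and the exact schema is an assumption.

-- Hypothetical ExtractList entry so the table is stored and archived in
-- InsightLanding during Process Data ExStore/Analytics ETL.
INSERT INTO InsightLanding.dbo.ExtractList (SourceTable, ImportFlag, Configuration)
VALUES ('COLLATERAL_RATING', 1, 'Local')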

Once the INSERT INTO query has successfully run, it is good practice to execute a SELECT query on the
ExtractList table to ensure that the details of our Local configuration entry have been entered correctly, as
shown below.

Figure 42 – SQL Query to check the locally developed content of ExtractList

Executing Process Data ExStore or Analytics ETL


Once the above configuration is completed, users will need to finalize the import of the new Core Banking
table in the database platform. This can be carried out either by the Process Data ExStore Agent job or by
the Analytics ETL job.

If Process Data ExStore is executed, the newly configured table, together with any other table defined in SourceSchema and ExtractList, will first be imported, data-profiled and data-typed in InsightImport, and then stored and archived in InsightLanding.

In addition to this, users can also run Analytics ETL to import the new table into InsightImport and InsightLanding. This Agent job will also create a copy of the table in the InsightSource database. However, if users also want to import one or more columns of the new table into one of the tables of the InsightWarehouse database or into the OLAP Cubes, they will have to apply some additional configuration to the Advanced Analytics Platform. A sample of this kind of configuration is shown in the following sections.

Post-Update Checks
The last step of the process is ensuring that the new table has been imported as expected in our Analytics
Platform.

InsightImport
In order to assess whether the upload of the new table into InsightImport was successful, users can begin by checking the content of the Insight.Entities table, as shown in the image below. This table contains a list of tables to be processed and is populated by the s_InsightImportTableList_Create stored procedure, which is executed during both Analytics ETL and Process Data ExStore. Therefore, the newly added table will only appear here if it was successfully stored in InsightImport.

Figure 43 - Sample of MS SQL Query used to check if locally developed field has been imported in InsightImport

Users should then check the content of the Tables folder in InsightImport to ensure that the new table is amongst the imported ones. Again, it should be noted that, had the definition of the table in SourceSchema requested the parsing of one or more local reference fields, multi-values or sub-values, multiple tables would appear in InsightImport > Tables. E.g. if the Core Banking COLLATERAL.RATING table had one local reference field to be parsed in Analytics, the InsightImport > Tables folder would store two associated tables after the import process, i.e. COLLATERAL_RATING and COLLATERAL_RATING_LocalRef. In the example chosen here, however, no parsing was defined and hence only the COLLATERAL_RATING table will appear.

Once users have ensured that the table or tables is/are in the InsightImport > Tables folder, they can also expand the Columns folder of the table and check its content. If the columns displayed here match the columns of the corresponding CSV file and are assigned the proper data types, it means that the new table structure is accurate and that it was properly data-profiled.

An example of a properly structured COLLATERAL_RATING table in InsightImport > Tables is shown in Figure 44.


Figure 44 - Sample of properly structured imported table in InsightImport

The last check to perform in InsightImport is running a query on the imported table to ensure that all
records present in the corresponding CSV file were imported correctly, as shown in Figure 45.

Figure 45 - Sample of properly populated imported table in InsightImport

InsightLanding
Users should now check that the new table has also been loaded into the InsightLanding database. To do so, they can open the InsightLanding > Tables folder and look for the table's name, e.g. BS.COLLATERAL_RATING, where BS stands for Banking System (meaning Temenos Core Banking). The naming convention for all Landing tables is <Source System Name>.<Table Name>.


If multiple tables had been created for COLLATERAL.RATING in InsightImport as a result of parsing, all
these tables would be also copied into InsightLanding. However, this is not the case in this example, as
shown in Figure 46.

Figure 46 - Sample of imported table in InsightLanding

Users will have to query the new table in InsightLanding in order to ensure that it is correctly populated. The BS.COLLATERAL_RATING table in InsightLanding will contain multiple days of data. For this reason, and due to the Columnstore Index, users should never query this table through a SELECT * statement; the select statement should explicitly include all the columns users are interested in, instead. The fastest way to query a new InsightLanding table is, therefore, to right-click on it and select the top 1000 records, as shown in the image below. In this scenario, only one day of data has been loaded, so there will be no need to filter on MIS_DATE.

Figure 47 - Sample of properly populated imported table in InsightLanding
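Where multiple business dates are present, an explicit column list with a MIS_DATE filter is preferable. A sketch follows; the column names other than MIS_DATE are illustrative assumptions.

-- Hypothetical explicit-column query: avoids SELECT * on a Columnstore
-- table and restricts the scan to a single business date.
SELECT MIS_DATE,
       [@ID],               -- illustrative column names
       COLLATERAL_CODE
FROM InsightLanding.BS.COLLATERAL_RATING
Where MIS_DATE = '2018-12-31'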

InsightSource
The InsightSource database will be updated with our newly imported table only if the Analytics ETL job is
executed in an Advanced Analytics Platform.

This database is used to integrate the latest data coming from potentially multiple source systems. Tables in InsightSource are replicas of the latest tables in InsightLanding; consequently, the naming convention for tables in this database is also <Source System Name>.<Table Name>, e.g. BS.COLLATERAL_RATING for our sample table. Figure 48 shows a partial screenshot of the InsightSource > Tables folder once the COLLATERAL_RATING table has been properly loaded into this database.

Figure 48 - Sample of imported table in InsightSource


Finally, users should run a query on the table imported into InsightSource to check that it has been populated correctly with all records, as shown in Figure 49.

Figure 49 - Sample of properly populated imported table in InsightSource

Adding Attribute Calculation with SQL Script – Split Operation


This section demonstrates how to include in the ETL processing an operation that divides the compound value of a column into separate values in an Analytics database platform. This kind of attribute calculation is called a "Split operation" and transforms a single column extracted from a source system (e.g. Temenos Core Banking) into multiple columns stored in another database within the Analytics Platform (e.g. InsightLanding or InsightSource). It is used both to improve the performance of Process Data ExStore and Analytics ETL and to facilitate the creation of multiple InsightWarehouse columns from a single source system column. The instructions included here apply to all three R19 Analytics Platforms, i.e. Data ExStore, Reporting Platform and Advanced Analytics Platform.

Users can design a split operation through the tools of MS SQL Server Management Studio on the
InsightETL..AttributeCalculations table or with the aid of a version control tool like MS Team Foundation
Server, however the example used in this section relies on MS SSMS.

The following instructions will illustrate how to configure a split operation in the AttributeCalculations table,
stored in the InsightETL database, using the @ID column in the T24 Temenos Core Banking’s LIMIT table
as an example. Furthermore, we will test how the split process is carried out during Analytics ETL based
on this configuration.

Pre-Update checks
Before users start working on the Attribute Calculation, they should examine the structure of the column to split by executing a query on the LIMIT table in the InsightImport database, as shown in Figure 50.


Figure 50 - Sample of query on the column to split

As shown in Figure 50, the LIMIT @ID is a compound attribute that combines the values of CUSTOMER @ID, LIMIT.REFERENCE @ID and a sequence number, separated by a dot ('.'). For this reason, each value will be split into three parts, with the dot indicated as the separator.

Also, users will have to ensure that a Split rule for this column does not exist yet in InsightETL, to avoid duplicates. The query in Figure 51 investigates whether other Attribute Calculations are already set up for the LIMIT table; it shows that a number of Calculations are defined for LIMIT in the InsightImport and InsightSource databases within the ModelBank and Framework configuration layers, but that there is no existing pre-defined @ID column split. Users can therefore proceed and configure the new split operation.

Figure 51 - Sample of MS SQL Query used to check for duplicates
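
For reference, a duplicate check along the lines of Figure 51 could be written as follows. This is a minimal
sketch only: TableName is an assumed column name and should be checked against the actual structure of
AttributeCalculations in your release.

SELECT *
FROM InsightETL.dbo.AttributeCalculations
WHERE TableName = 'LIMIT';   -- TableName is an assumed column name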

Updating the AttributeCalculations table in the InsightETL Database


Once pre-update checks are completed, users can create a new definition for the split in the
AttributeCalculations table in the InsightETL database through a statement similar to the one illustrated in
Figure 52.

If users want the column to be split during Analytics ETL or Process ExStore, the value of the Enabled_
column must be set to ‘1’. In addition to this, users have to set the value of the Configuration column
to ‘Local’, as this new column will belong to the Local Configuration layer of the platform, and they have to
specify the target Database, Schema/Table Name and Column in which the split should take place.

Figure 52 – Sample of insert statement to update AttributeCalculations with the new Split definition
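
For orientation, the INSERT could take roughly the following shape. This is a hedged sketch: Enabled_,
Configuration and ColumnSplitNameSuffix are named in the text above, while Database_, SchemaName,
TableName, ColumnName and ColumnSplitSeparator are assumed names that must be verified against the
actual structure of AttributeCalculations.

INSERT INTO InsightETL.dbo.AttributeCalculations
    (Enabled_, Configuration, Database_, SchemaName, TableName,
     ColumnName, ColumnSplitSeparator, ColumnSplitNameSuffix)   -- several names assumed
VALUES
    (1, 'Local', 'InsightSource', 'BS', 'LIMIT', '@ID', '.', 'POS');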

Once the INSERT INTO query has successfully run, it is good practice to execute a SELECT query on the
AttributeCalculations table to ensure that the details of the Local configuration were entered correctly, as
shown in Figure 53.

Figure 53 – SQL Query to check the locally developed content of AttributeCalculations

Executing the Column Split


Users need to finalize the column split process on the column they selected, according to the instructions
defined in the AttributeCalculations table in InsightETL.

In general, a Split process can be carried out either by the Process Data ExStore Agent job or by the
Analytics ETL job.

More specifically, the Process Data ExStore Agent job contains a step called Insight Attribute Calculations
which will execute splits for columns stored in InsightImport tables. The Analytics ETL Agent job, instead,
includes the Insight Attribute Calculations-Import step that also applies splits to columns in the
InsightImport database’s tables. In addition to this, it also includes the Insight Attribute Calculations-Source
step that applies splits to columns in the InsightSource’s tables while any split in the InsightStaging
database will be looked after by the core InsightStaging Update ETL steps.

It should be noted that the ModelBank and Framework configuration layers of the Data ExStore, Reporting
Platform and Advanced Analytics Platform only contain Split operations targeting the InsightImport,
InsightSource and InsightStaging databases. However, additional steps can be locally added to both agent
jobs to process any split operation applied to InsightLanding.

In our specific example, InsightSource is selected as the target database for the split. For this reason,
users will run the Analytics ETL job to process the split.

In case a full Analytics ETL has already been executed for the current business date and users do not want
to re-run it, it is possible to execute only the step in charge of processing the Split. This step consists of a
call to the InsightETL.[dbo].s_CreateColumnCalculations stored procedure using the name of the target
database as the first parameter. If users run this stored procedure as a standalone query, they can also specify
the target Table and Schema to process as the second and third input parameters, respectively.
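
As a sketch, the standalone call could look like the following; the parameters are positional, per the
description above, and any further optional parameters are omitted here.

-- Process all enabled splits/calculations targeting InsightSource
EXEC InsightETL.dbo.s_CreateColumnCalculations 'InsightSource';

-- Or restrict the run to the BS.LIMIT table only
EXEC InsightETL.dbo.s_CreateColumnCalculations 'InsightSource', 'LIMIT', 'BS';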

Post-Update Checks
Once the column split has been executed successfully, users can run a query on the target table to check
the end result of the process. The query in Figure 54 selects the content of the BS.LIMIT table in
InsightSource because this is where the @ID split has taken place. Please note that the columns storing
the LIMIT @ID split information are called ‘@ID_POS1’, ‘@ID_POS2’ etc. because we assigned the value
‘POS’ to the ColumnSplitNameSuffix parameter in the AttributeCalculations definition for our Split rule.

Figure 54 - Sample of MS SQL Query used to check if the locally developed split was executed correctly
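
A check along these lines could be, for example (the TOP clause simply keeps the result set small; note
that column names starting with @ must be enclosed in brackets):

SELECT TOP 100 [@ID], [@ID_POS1], [@ID_POS2], [@ID_POS3]
FROM InsightSource.BS.[LIMIT];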

Adding Attribute Calculation with SQL Script – Calculation


This section demonstrates how to include in the ETL processing an operation that determines a value by
applying a SQL expression to a column’s attribute on an Analytics Database platform. This kind of attribute
calculation is simply referred to as a “Calculation”. As for split operations, calculation definitions are stored
in the AttributeCalculations table of the InsightETL database.

Users can design a calculation through the tools of MS SQL Server Management Studio on the
InsightETL..AttributeCalculations table or with the aid of a version control tool like MS Team Foundation
Server; however, the example used in this section relies on MS SSMS.

The following example shows how to configure a calculation which derives the date on which a LIMIT
record was last updated, using the DATE_TIME column in the T24 Temenos Core Banking LIMIT table as
an example. Furthermore, it tests how the calculation process is carried out during Analytics ETL based
on this configuration.

Pre-Update checks
Before starting work on the Attribute Calculation, users should examine the structure of the column to be
used as the basis for the calculation by executing a query on the LIMIT table in the InsightImport
database, as shown in Figure 55.


Figure 55 - Sample of query on the source column for our calculation

DATE_TIME is an audit column present in most Temenos Core Banking tables. It includes information about
the date and time of the last change on a record – specifically, the date part is stored in the first six digits of
the DATE_TIME value using a YYMMDD format, e.g. if the value of the DATE_TIME column is 1705140709,
the date of the last change on the limit is given by the string 170514, i.e. it is the 14th of May
2017. The expression to be defined in the calculation will extract the date string from the DATE_TIME column
and load it into a new column of the LIMIT table, called ETL_CHANGE_DATE, in a YYYYMMDD format (e.g. the 14th
of May 2017 will be stored as ‘20170514’).
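
For illustration, a T-SQL expression of this kind could perform the conversion. This is a sketch which
assumes DATE_TIME is held as a character string and that the table sits in the dbo schema of
InsightImport; the actual ModelBank expression may differ.

SELECT DATE_TIME,
       CONVERT(VARCHAR(8),
               CONVERT(DATE, LEFT(DATE_TIME, 6), 12),   -- style 12 reads YYMMDD
               112) AS ETL_CHANGE_DATE                  -- style 112 writes YYYYMMDD
FROM InsightImport.dbo.[LIMIT];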

After checking the DATE_TIME column structure, users have to ensure that a Calculation rule deriving
the latest update date does not already exist in InsightETL, to avoid duplicates. The query in Figure
56 investigates whether other Attribute Calculations involving the DATE_TIME column are already
set up for the LIMIT table; it shows that there is no existing pre-defined calculation relying
on the DATE_TIME column. Users can therefore proceed with the new configuration.

Figure 56 - Sample of MS SQL Query used to check for duplicates

Updating the AttributeCalculations table in the InsightETL Database


Once the pre-update checks are completed, users can create a new definition for the calculation in the
AttributeCalculations table in the InsightETL database. This is done in MS SQL Server Management Studio,
using an INSERT INTO statement similar to the one shown in Figure 57.

Please note that, if users want a column to be calculated during Analytics ETL or Process ExStore, the value
of the Enabled_ column must be set to ‘1’. In addition to this, they have to set the value of the Configuration
column to ‘Local’, as this new column will belong to the Local Configuration layer of the platform, and they
have to specify the target Database, Schema Name and Table Name in which the calculation should take
place. Unlike in the split operation, users do not have to provide a target column name and a
ColumnSplitNameSuffix in this calculation definition – instead, they have to include values for the SQLExpression
and CalculationColumnName columns, which store the T-SQL expression used for the
calculation and the name of the new column where the results of the calculation will be stored, respectively.

Figure 57 – Sample of insert statement to update AttributeCalculations with the new Calculation definition
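
A hedged sketch of such an INSERT follows; as before, Database_, SchemaName and TableName are
assumed column names, while SQLExpression and CalculationColumnName are named in the text above.

INSERT INTO InsightETL.dbo.AttributeCalculations
    (Enabled_, Configuration, Database_, SchemaName, TableName,   -- names partly assumed
     CalculationColumnName, SQLExpression)
VALUES
    (1, 'Local', 'InsightSource', 'BS', 'LIMIT',
     'ETL_CHANGE_DATE',
     'CONVERT(VARCHAR(8), CONVERT(DATE, LEFT(DATE_TIME, 6), 12), 112)');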

Once the INSERT INTO query has successfully run, it is good practice to execute a SELECT query on the
AttributeCalculations table to ensure that details of our Local configuration were entered correctly, as shown
in Figure 58.

Figure 58 – SQL Query to check the locally developed content of AttributeCalculations

Executing the Column Calculation


Users need to finalize the column calculation process according to the instructions defined in the
AttributeCalculations table in InsightETL.

Calculations can be carried out either by the Process Data ExStore Agent job or by the Analytics ETL job,
depending on the target database selected for the calculation rule.

More specifically, the Process Data ExStore Agent job contains a step called Insight Attribute Calculations
which will execute calculations for columns stored in InsightImport tables. The Analytics ETL Agent job,
instead, includes the Insight Attribute Calculations-Import step that also applies calculations to columns in
the InsightImport database’s tables. In addition to this, it also includes the Insight Attribute Calculations-
Source step that applies calculations to columns in the InsightSource’s tables while any calculations in the
InsightStaging database will be looked after by the core InsightStaging Update ETL steps.

It should be noted that the ModelBank and Framework configuration layers of the Data ExStore, Reporting
Platform and Advanced Analytics Platform only contain calculations targeting the InsightImport,
InsightSource and InsightStaging databases. However, additional steps can be locally added to both agent
jobs to process any calculation applied to InsightLanding.

In the specific case of our calculation example, however, InsightSource is the target database. Therefore,
users should use the Analytics ETL Agent job as it includes the Insight Attribute Calculations-Source step
that applies Calculations to columns in the InsightSource’s tables.

The alternative, if users do not want to execute a full Analytics ETL, is to run only the step in charge of
processing the calculation. This step consists of a call to the InsightETL.[dbo].s_CreateColumnCalculations
stored procedure using the name of the target database as the first parameter. If users run this stored procedure
as a standalone query, they can also specify the target Table and Schema to process as the second and third
input parameters, respectively.


Post-Update Checks
Once the calculation has been executed successfully, users can run a query on the target table to check
the end result of the process. Figure 59 shows a query selecting the content of the BS.LIMIT table in
InsightSource because this is where the date calculation has taken place. The column storing the date of
the last change on a LIMIT record is called ‘ETL_CHANGE_DATE’ because this value was assigned to the
CalculationColumnName parameter in the calculation definition.

Figure 59 - Sample of MS SQL Query used to check if the locally developed calculation was executed correctly

Adding a Budget Source System with SQL Script


This section demonstrates how to add a Budget source system (or any other third-party system) to an
Analytics Platform. This can be achieved using the tools of MS SQL Server Management Studio and SQL
script or a source code management system like MS Team Foundation Server; however, the former is used
in this example.

This section assumes that the Budget data used is already suitable to be imported into InsightLanding –
the kind of Budget data processing needed may vary depending on the original format of the Budget
data and should be discussed with the Analytics Product team.

Checking Budget data format


Before users start integrating a Budget third-party system into the platform, they should check the table(s)
within the source system database and their columns in SQL Server Management Studio.

In this example, all Budget data is stored in a single table called GLBudget. GLBudget, in turn, is structured
into 16 columns, whose primary key is LINE_ID (see Figure 60). This is the standard Budget format
for which Analytics Platforms provide out-of-the-box views, tables and columns within the ModelBank
configuration layer.


Figure 60 - Sample of Budget table format in MS SQL Query

Configuring Budget table load in InsightLanding


The second step of this process is to enable the load of the Budget table(s), e.g. GLBudget, into the
InsightLanding database. This is done by adding a definition for each of the Budget tables that should be
imported in the ExtractList table and by including a Budget-specific entry in the ExtractSourceDate table.

Updating ExtractList
ExtractList is a configuration table storing the definition and the extract configuration parameters for all the
tables to be stored and archived in InsightLanding, loaded either from InsightImport or from third-party
source systems like Budget.

Before updating ExtractList, it is good practice to ensure that a Budget table definition does not already
exist, in order to avoid duplicates. Figure 61 shows a sample of a query to check this.

Figure 61 - Sample of MS SQL Query used to check for duplicates

Once users have ensured that no ExtractList definition exists yet, they can create one similar to the
sample INSERT INTO statement illustrated in Figure 62.


Figure 62 – Sample of insert statement to update ExtractList with new definition
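
A sketch of such an INSERT is shown below. ImportFlag and Configuration are named in the text;
SourceName and TableName are assumed column names, and the database hosting ExtractList (InsightETL
here) should be verified against your installation.

INSERT INTO InsightETL.dbo.ExtractList
    (SourceName, TableName, ImportFlag, Configuration)   -- names partly assumed
VALUES
    ('Budget', 'GLBudget', 1, 'Local');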

If the table has to be uploaded to InsightLanding, the value of the ImportFlag column must be set to ‘1’.
Furthermore, the value of the Configuration column should be set to ‘Local’ as this new entry will belong
to the Local Configuration layer.

Once the INSERT INTO script has run successfully, users should query the ExtractList table again to
ensure the new entry has been added correctly, as shown below.

Figure 63 – SQL Query to check the locally developed content of ExtractList

Updating ExtractSourceDate
The ExtractSourceDate table stores the queries that are used to retrieve the current extract date from the
source system data. The date returned from the query will be used to create the date portion of the schema
used for each table stored in Insight Landing. One record for each source system defined in the Extract
List table is required and for this reason we need to create a new entry also for the Budget source system.

Users should check the existing content of ExtractSourceDate in MS SSMS before adding any new entries,
to avoid duplicates. To do so, they can use a query similar to the one illustrated in Figure 64 – for each
entry, the name of the source system is identified by the SourceName column, and no Budget-related
definition exists yet in this example.

Figure 64 - Sample of MS SQL Query used to check for duplicates

Once users have ensured that no duplicates exist, they can input a Budget-related entry by executing a
script similar to the one shown in Figure 65.


Figure 65 – Sample of insert statement to update ExtractSourceDate with new definition
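
As a sketch, the entry could look like the following; SourceDateQuery is an assumed column name, and the
inner date query against the Budget database is purely hypothetical (GLBudget's real date column must be
used instead).

INSERT INTO InsightETL.dbo.ExtractSourceDate
    (SourceName, SourceDateQuery, Configuration)          -- names partly assumed
VALUES
    ('Budget',
     'SELECT MAX(AS_OF_DATE) FROM BudgetDB.dbo.GLBudget', -- hypothetical date query
     'Local');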

When the script to update ExtractSourceDate has executed successfully, users should query this table again
to ensure that all columns were populated correctly (see Figure 66).

Figure 66 – SQL Query to check the locally developed content of ExtractSourceDate

Customizing and testing Analytics ETL Agent Job


Users should update the Analytics ETL agent job in order for Budget data to be processed when it is executed.
Specifically, users will add a new step running in InsightLanding for the landing and archiving of the Budget
table(s), and they will modify the step executing the InsightSource update to also include Budget tables from
InsightLanding. Once all the required changes are applied, users will run a partial Analytics ETL to test
these two steps.

Create InsightLanding CSI Budget Update step


Users will add a new step similar to the one shown in Figure 67, called ‘InsightLanding CSI Budget Update’,
to the Analytics ETL agent job. Please note that the step below is essentially a copy of the ‘InsightLanding
BS Update’ step, as it executes the dbo.s_InsightLanding_CSI_Table_Update stored procedure in the
InsightLanding database. The first input parameter of this stored procedure, however, will have the value
‘Budget’ in this new step so that, when it is executed, the data loaded and archived in InsightLanding is
extracted from the Budget source system.


Figure 67 – Agent Job Step to load Budget data to InsightLanding
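
A sketch of the command behind such a step follows, assuming the stored procedure can be called with the
source system name alone (the out-of-the-box step may pass further parameters):

EXEC InsightLanding.dbo.s_InsightLanding_CSI_Table_Update 'Budget';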

Once the new step has been created, it has to be positioned just after the InsightLanding CSI BS
Update step, which targets the BS (i.e. Banking System) source system.

Modifying InsightSource Update step


Then, users have to modify the “InsightSource Update” step, which comes pre-configured
as part of the out-of-the-box Analytics ETL agent job. Specifically, they have to add a new line of code
to the Command section of the step, in which the s_InsightSource_CSI_Update stored procedure is
executed using ‘Budget’ as the value of the first parameter (i.e. @sources) – this new line of code ensures
that the latest entries from Budget tables are loaded from InsightLanding to InsightSource. A sample of
these updates is shown in Figure 68.


Figure 68 – Agent Job Step to load Budget data to InsightSource
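
As a sketch, the additional line in the Command section could read as follows; the database hosting the
stored procedure is assumed to be InsightSource.

EXEC InsightSource.dbo.s_InsightSource_CSI_Update @sources = 'Budget';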

Test changes in Analytics ETL


Now that the changes to Analytics ETL are complete, users can run a partial or full process to test the updates.
If ETL has not yet been executed in the platform for the current business date, it should be started from
Step 1. If Banking System data has already been processed, instead, users can simply execute the
InsightLanding CSI Budget Update and the InsightSource Update steps.

Once these steps run successfully, the content of the GLBudget table will be copied to the InsightLanding
database, together with a Budget-specific instance of the SourceDate table. The schema for both these
tables will be the string BUDGET, as shown in Figure 69.

Figure 69 – Budget tables successfully loaded into InsightLanding

Moreover, a copy of the latest Budget tables in InsightLanding will also be loaded into InsightSource with
schema ‘Budget’ – both in InsightLanding and in InsightSource, the columns and structure of the GLBudget
table will reflect the columns and structure of the table with the same name within the Budget source
system, as shown in Figure 70.


Figure 70 – Budget tables successfully loaded into InsightSource

Please note that, even if a full Analytics ETL was executed, Budget data will only be loaded up to the
InsightSource database at this stage and not fully processed into the InsightWarehouse database. The
reason for this is that the necessary configuration in InsightStaging has not been performed yet – this
configuration is illustrated in the next section of this document.

Configuring Budget data processing in InsightStaging


Users now have to configure the new Budget source system in the InsightStaging database. In order to do
so, they need to enable the Budget source system in the Systems table and check the
structure of the out-of-the-box Budget v_source view. In case the structure of this view does not reflect
the structure of the Budget source table(s), users will have to create a new v_source view to include the
required local development and also include the steps for processing the new v_source view in the
UpdateOrder table.

Updating Systems
The Systems table in InsightStaging controls which source systems are included in the Analytics ETL
process. For this reason, it is essential to configure an entry for Budget to ensure that data is
extracted from this source system, then transformed and loaded into the InsightWarehouse database.

Querying the content of Systems as in Figure 71, users will see that an entry for Budget already exists.
However, the Enabled_ column is set to 0 for this record, i.e. the processing of Budget within Analytics ETL
is disabled.


Figure 71 – SQL Query to check the content of Systems

If using MS SSMS to customize Systems, a simple UPDATE statement can be used to enable the Budget
source system, as highlighted in the following picture.

Figure 72 – Sample of update statement to enable ETL processing for Budget data
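
A minimal sketch of such an UPDATE, assuming the source system is identified by a SourceName column
(the actual key column of Systems may differ):

UPDATE InsightStaging.dbo.Systems
SET Enabled_ = 1
WHERE SourceName = 'Budget';   -- SourceName is an assumed column name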

Once the UPDATE statement has been executed, it is good practice to rerun a query on Systems to ensure
that the Budget entry was edited correctly.

Figure 73 – SQL Query to check updates in Systems

v_Source Views
The Advanced Analytics Platform’s ModelBank includes an out-of-the-box v_source view used to map
Budget source columns to target GL-related tables and columns in the InsightWarehouse database.
This out-of-the-box v_source view for Budget is called v_sourceGLBudget, and users can check the
script of this view by opening it in ALTER mode, using a new query window, as shown in Figure 74.


Figure 74 – v_source view for Budget in InsightStaging

Important note: in case the out-of-the-box Budget v_source view requires mapping
updates or does not include some important columns of the Budget source system,
users can create a new locally developed v_source view. This will also impact the
InsightStaging..UpdateOrder table (which will have to be updated to enable ETL
processing of the new view) and the InsightWarehouse..DataDictionary table (which
will have to be updated to include the new columns in the existing GL and GLConsolidated
objects). Please refer to the Adding a New Column to the InsightWarehouse database
with SQL Script section for further information about this process.

Executing Analytics ETL


Users need to finalize the import of data from the Budget source system into the InsightWarehouse. This can
be carried out by the Analytics ETL job.


If data for the current business date has already been imported from Temenos Core Banking and users
just need to run an import for Budget data, they do not need to execute a full Analytics ETL but can start
from the InsightLanding CSI BS Update step.

Post-Update Checks
The last step of the process is to ensure that the new table has been imported as expected in the Analytics
Platform.

InsightWarehouse
Budget entries will be stored in the GL and GLConsolidated-related tables and views. In order to ensure
that these entries have been correctly added and populated, users can execute the following queries on the
v_GLConsolidated view in the InsightWarehouse database.

First, users can filter those records for which the GLBudgetAmount column has been populated, i.e. the
Budget entries.

Figure 75 – Query on InsightWarehouse.dbo.v_GLConsolidated that filters Budget data
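
For example, a filter of this kind could be used (GLNum and GLBudgetAmount are referenced in this
section; any other columns of v_GLConsolidated are omitted here):

SELECT GLNum, GLBudgetAmount
FROM InsightWarehouse.dbo.v_GLConsolidated
WHERE GLBudgetAmount IS NOT NULL;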

Secondly, users can run a query on all the GLConsolidated entries for the same GL Account (e.g. for an
account with GLNum = MBPL.0030) as shown in Figure 76.


Figure 76 – Query on InsightWarehouse.dbo.v_GLConsolidated that filters on a specific account

The query above will return a list of records from Temenos Core Banking and only one record from the
Budget source system (i.e. the record whose GLBudgetAmount column is not null) – the latter defines
the total monthly budget amount for the GL account considered by the query.

GL Reports in the Analytics Web Front End


Users should also check the content of the budget reports in the Analytics Application to ensure that they
reflect the data that was loaded into the InsightWarehouse.
Figure 77 displays an example of a Budget report, i.e. the out-of-the-box Balance Sheet – Budget report,
which is part of the Financial Analytics Content Pack.


Figure 77 – Budget data shown in Balance Sheet – Budget report

Adding a New Column from a Source System or from InsightETL to the Data Warehouse with SQL Script (Fact Column)

This section demonstrates how to include a new column in either a Fact or a Dim table of the
InsightWarehouse database, within the Advanced Analytics Platform.

When a new column is added to the Advanced Analytics Platform, it can either be directly imported from a
Source System or calculated within the platform itself. If the column is calculated, its value
can be either the result of a split/calculation defined in the AttributeCalculations table of InsightETL or of
a business rule, also stored in InsightETL but defined through the Data Manager feature.

Columns defined through AttributeCalculations are normally added to a copy of a source system table either
in the InsightImport database (the whole table is then copied to InsightLanding and to InsightSource
through Analytics ETL) or directly in the InsightSource database. In either case, they will
be displayed as standard columns of a source system table in InsightSource, right before being extracted,
transformed in InsightStaging and loaded into InsightWarehouse like standard Source System columns.
Splits and calculations can also be added to other databases, but additional customization would be required
in this case.

Columns defined through Data Manager’s business rules, instead, can be applied by default to tables
residing in the InsightLanding, InsightSource and InsightStaging databases or to abstraction views in
InsightWarehouse. The types of rules that can be designed through Data Manager are banding, calculation,
Custom Table, Dataset and Lookup. These rules will be directly applied to the appropriate database through
dedicated steps in Analytics ETL.


The configuration of columns directly extracted from a Source System is very similar to the configuration
of columns defined through InsightETL..AttributeCalculations, so this guide processes these two kinds
of columns in the same way. A separate section, instead, is dedicated to Data Manager business
rules and to the columns resulting from them.

The configurations illustrated in this section rely on the tools of MS SQL Server Management
Studio, but new columns can also be added to InsightWarehouse through a source code management
system like MS Team Foundation Server.

Important note: new columns’ names should be less than 15 characters long and
comply with Analytics and SQL naming conventions.
As previously mentioned, the instructions in this section apply both to columns directly imported from a
Source System and to columns obtained through a business rule defined in InsightETL through
AttributeCalculations. To illustrate these two scenarios, the examples used will be the ANNUAL_BONUS
column and the ETL_ANNUAL_BONUS column. The former is a column found in the CUSTOMER table
in InsightImport; it contains data from the ANNUAL.BONUS field in CUSTOMER, a Temenos Core
Banking table. The latter is a column whose value is obtained from a calculation defined in
InsightETL..AttributeCalculations and that is added to the CUSTOMER table in InsightSource (the same
approach applies to business rules executed in InsightImport or InsightLanding, as long as they are reflected
in InsightSource), as shown in the previous sections.

Either ANNUAL_BONUS or ETL_ANNUAL_BONUS can be added as a new column called AnnualBonus to
the FactCustomer table of the InsightWarehouse database. The AnnualBonus column will be populated
based on the value of the ETL_ANNUAL_BONUS calculated column – however, this section will also illustrate
how the configuration changes if AnnualBonus is instead mapped to the ANNUAL_BONUS column, directly
imported from the CUSTOMER table in Temenos Core Banking.

Pre-Update checks
Before users start working on importing the new column, they should ensure that it is available in the
InsightSource database. In both examples considered, the columns to be imported belong to the
CUSTOMER table, and Figure 78 shows two queries that can be used to ensure that either ANNUAL_BONUS
or ETL_ANNUAL_BONUS has been stored in InsightSource correctly.


Figure 78 - Samples of query on the column in InsightSource

The second check users need to perform is on the DataDictionary table in the InsightWarehouse database,
which contains a definition for all columns of data tables or views in InsightWarehouse.

As previously explained, the new column to be added is called AnnualBonus, and it can be mapped either
to a column directly imported from the source system (e.g. ANNUAL_BONUS) or to a calculated
column (e.g. ETL_ANNUAL_BONUS). Users need to ensure that a definition for AnnualBonus does not
already exist in DataDictionary, to avoid duplicates; they can use the following query to do so.

Figure 79 - Samples of query to check for duplicates in InsightWarehouse

Configuring InsightStaging
Users should also configure the InsightStaging database so that the newly added column is populated
correctly when the Analytics ETL flow is executed. Firstly, users have to create a new v_source view to
map the new column. Afterwards, they need to create new entries within the UpdateOrder table so
that this new v_source view is taken into consideration within the dataflow of the Analytics ETL agent job.

Please note that these two steps are only required if no locally configured v_source view exists for a specific
object (e.g. Customer). Otherwise, new columns for the object can just be added to the existing local
v_source view, and no new UpdateOrder entry will be needed.

Creating a locally configured v_source view for the target object


The column used as an example in this document is called AnnualBonus and it belongs to the Customer
object, i.e. it should be included either in the DimCustomer or in the FactCustomer table of the
InsightWarehouse database in our Advanced Analytics Platform. In order to populate AnnualBonus, users
need a v_source view for the Customer object within the Banking System that maps AnnualBonus to
a column stored in InsightSource – the column in InsightSource can either be directly extracted from a
Source System or calculated through InsightETL. To obtain this v_source view, users have a number
of options: they can create, from scratch, a view that only contains the AnnualBonus column to be
mapped; they can modify the script of the v_sourceCustomerBS view, which already exists out-of-the-box,
by including a line of code that maps AnnualBonus to its source; or they can create a brand new
v_source view that replaces v_sourceCustomerBS and also maps any locally configured Customer column,
then enable the latter and disable the former. This section illustrates the third option, as it allows users
both to develop locally configured v_source views and to keep a backup of the original v_source views
provided out-of-the-box by Temenos.

The locally configured v_source view will be called v_sourceCustomerBS_BNK and it can be designed using
the script of the v_sourceCustomerBS view as a basis.

To copy the script of v_sourceCustomerBS into the new view, users can open MS SQL Server Management
Studio, then right-click on the v_sourceCustomerBS and select the option Script View As > CREATE To >
New Query Editor Window, as shown in Figure 80.

Figure 80 – Altering a v_source view in InsightStaging

Once the query has been opened in a new window, users should modify the view name to
v_sourceCustomerBS_BNK, as shown in Figure 81.

Figure 81 – Creating a locally developed v_source view in InsightStaging


Configuration for Columns resulting from InsightETL rules (designed through Data Manager or AttributeCalculations)
Users should then append a new local configuration section, like the following, that includes the source-to-
target mapping for the new column. In the example shown in Figure 82, the ETL_ANNUAL_BONUS column
designed through AttributeCalculations is used, but the mapping for a source system column like
ANNUAL_BONUS would be the same, as shown in Figure 83.

The same ‘Local Changes’ (or ‘Local Configuration’) section can also be used, in a later stage, for other
locally developed Customer columns, if needed.

Figure 82 – Adding locally developed columns to locally developed v_source view in InsightStaging

Configuration for Columns directly extracted from a Source System


The mapping process is the same for both columns directly extracted from a source system and columns
calculated in InsightETL. Figure 83 shows how AnnualBonus can be mapped to the ANNUAL_BONUS
column, directly extracted from the CUSTOMER table in Temenos Core Banking (then imported into the
InsightSource database as the BS.CUSTOMER table).

Figure 83 – Adding source system columns to locally developed v_source view in InsightStaging
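
Putting the two variants together, the local view could contain mapping lines of the following shape. This
is a heavily simplified sketch: the real view keeps the full out-of-the-box column list and joins copied from
v_sourceCustomerBS, and the cus alias is an assumption.

CREATE VIEW dbo.v_sourceCustomerBS_BNK
AS
SELECT cus.*,                                -- out-of-the-box mappings kept from the copied script
       -- Local Changes
       cus.ETL_ANNUAL_BONUS AS AnnualBonus   -- calculated-column variant
       -- cus.ANNUAL_BONUS AS AnnualBonus    -- source-system-column variant
FROM InsightSource.BS.CUSTOMER AS cus;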


Once users have applied the changes highlighted above, they can execute the script – this both checks
for errors and generates the new v_sourceCustomerBS_BNK view under InsightStaging > Views.

Source view validation


Users should validate that the new local source view returns records. Figure 84 shows a sample query
used for this kind of validation – this SELECT statement will only work if business data is available in
InsightSource.

Figure 84 – Sample of query to validate v_source view in InsightStaging

Creating new UpdateOrder entries


As users have created a new locally configured v_source view, they have to add some new entries to
the UpdateOrder table to ensure that the mapping defined in the new v_source view is considered in the
Analytics ETL dataflow. In addition to this, users should disable the existing entries associated with the out-
of-the-box v_sourceCustomerBS view, which should be ignored during Analytics ETL.

To enable the locally developed v_source view, users can execute an INSERT INTO statement similar to
the one shown in Figure 85.

Figure 85 – Sample of insert statement to enable the ETL processing of the locally developed v_source view in
InsightStaging..UpdateOrder
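
As a sketch, the INSERT could look like the following; apart from Exclude and Configuration, the column
names of UpdateOrder are assumptions and must be checked against the actual table.

INSERT INTO InsightStaging.dbo.UpdateOrder
    (ViewName, UpdateGroup, ProcessOrder, Exclude, Configuration)  -- names partly assumed
VALUES
    ('v_sourceCustomerBS_BNK', 'Customer', 1, 0, 'Local');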

Disabling the UpdateOrder entries for old v_source view


To disable the out-of-the-box v_sourceCustomerBS, users should set the value of the Exclude column to
1 for the UpdateOrder entries associated with the out-of-the-box v_sourceCustomerBS view, using a statement
similar to the one shown in Figure 86.


Figure 86 – Sample of update statement to disable the ETL processing of the out-of-the-box v_source view in
InsightStaging..UpdateOrder
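
A sketch of such an UPDATE, again assuming a ViewName column:

UPDATE InsightStaging.dbo.UpdateOrder
SET Exclude = 1
WHERE ViewName = 'v_sourceCustomerBS';   -- ViewName is an assumed column name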

Update order validation


Users should run a query to ensure that the UpdateOrder table was edited correctly, similar to the one
shown in Figure 87. If the results of the query match the changes applied above, users can proceed to
the next step.

Figure 87 – Sample of query to check updates in InsightStaging..UpdateOrder

Configuring InsightWarehouse
Lastly, users have to update the DataDictionary table in InsightWarehouse. DataDictionary stores the
definitions for all columns in the Dim, Fact and Bridge tables and for all the abstraction and data source
views (with schema Cubes) within the InsightWarehouse database. Therefore, if users want to add a new
column to the Warehouse, they need to include its definition within DataDictionary.

As previously discussed, AnnualBonus belongs to the Customer object, i.e. it should be included either in
the DimCustomer or in the FactCustomer table of the InsightWarehouse database in our Advanced Analytics
Platform. In general, both columns directly extracted from a source system and columns resulting from
InsightETL rules can be added either as Dim or as Fact columns. Whether a column should be set up as a
Fact or as a Dimension is debatable and depends on how often the value of such a column is likely to change
and on the preferences of the bank requesting this local configuration column. In this
document, AnnualBonus is set up as a Fact column to provide an example of how Fact definitions
should be defined in DataDictionary. An example of a Dim definition is also provided later in this document.

Creating new DataDictionary definitions for Facts


Figure 88 shows an example of INSERT statement used to add a new Fact column definition.


Figure 88 – Sample of insert statement to add a new Fact definition in InsightWarehouse..DataDictionary
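
A hedged sketch of the three Fact-related entries follows; the column list of DataDictionary (TableName,
ColumnName, DataType) is assumed, the money data type is illustrative, and SCDType is NULL because, as
noted later, this parameter only applies to Dim definitions.

INSERT INTO InsightWarehouse.dbo.DataDictionary
    (TableName, ColumnName, DataType, Configuration, SCDType)      -- names partly assumed
VALUES
    ('FactCustomer',       'AnnualBonus', 'money', 'Local', NULL),
    ('v_Customer',         'AnnualBonus', 'money', 'Local', NULL),
    ('v_CubeFactCustomer', 'AnnualBonus', 'money', 'Local', NULL);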

As shown in Figure 88, the INSERT INTO statement sets up three new entries: the definition of
AnnualBonus as a new column in the FactCustomer table; the definition of AnnualBonus as a new column
in the v_Customer view (which is part of the abstraction layer of InsightWarehouse); and the definition of
AnnualBonus as a new column in the v_CubeFactCustomer view (also part of the abstraction layer
of InsightWarehouse, used for the update of the SSAS Cubes Measures). This means that the new AnnualBonus
column will not only be available in the Warehouse but can also be exposed to Reports and SSAS
Cubes.

Checking duplicates and adding Column in Table and Views


In order to perform duplicate checks in DataDictionary and also to add the new column to the FactCustomer
table and to the v_Customer and v_CubeFactCustomer views, users have to execute the
s_DDCombinedRecords_update, the s_TableStructureFromDD_update and the
s_ViewStructureFromDD_update stored procedures in the InsightWarehouse database.

These routines must be executed in order to update any Dim, Fact or Bridge table and any
abstraction or data source view in InsightWarehouse whenever a column definition is added to
DataDictionary.
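
Assuming the three routines can be run without arguments (any optional parameters are omitted here),
the sequence would be:

EXEC InsightWarehouse.dbo.s_DDCombinedRecords_update;
EXEC InsightWarehouse.dbo.s_TableStructureFromDD_update;
EXEC InsightWarehouse.dbo.s_ViewStructureFromDD_update;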

InsightWarehouse Checks
Users should check that the new entries were correctly included in the DataDictionary table by running a
query similar to the one shown in Figure 89.

Figure 89 – Sample of query that checks locally developed definitions in InsightWarehouse..DataDictionary

In addition to the three new manually created entries in DataDictionary, which had the Configuration value
set to ‘Local’, the query output will show an extra definition for the AnnualBonus column, with Configuration
set to ‘Combined Configuration’. This was generated by the s_DDCombinedRecords_update stored
procedure.

To ensure that the structure of the FactCustomer table has been amended correctly by the
s_TableStructureFromDD_update stored procedure, users can run a query on the FactCustomer table as
shown in Figure 90.


Figure 90 – Sample of query that checks if locally developed fields were added to InsightWarehouse’s table

The query output shows that the AnnualBonus column has been correctly included in the FactCustomer
table but has not been populated yet – the reason for this is that Analytics ETL has not run since the
DataDictionary update.

The same applies to the v_Customer and the v_CubeFactCustomer views – if the
s_ViewStructureFromDD_update stored procedure has been executed correctly, the AnnualBonus column
will be added to these two views but not populated yet, as shown in Figure 91.

Figure 91 – Sample of queries that checks if locally developed fields were added to InsightWarehouse’s views

Executing Analytics ETL


Users will now populate the new AnnualBonus column that was added to the InsightWarehouse database.
This is done by executing the Analytics ETL process in the Advanced Analytics Platform. Users can either
execute this agent job from step 1 or, if the Analytics ETL has already been executed for the current date
and only the new field has to be populated, they can start from the “Insight Attribute Calculations – Source”
step.

Once Analytics ETL has been executed correctly, users can check that the AnnualBonus column has been
populated correctly in the FactCustomer table and in the v_Customer and v_CubeFactCustomer views.

Post-Update Checks
Figure 92 shows samples of queries that can be used to ensure that values for the AnnualBonus column
were populated correctly across the target table and views in the InsightWarehouse database. The WHERE
clause of all the SELECT statements below filters results on the current business date, as the
AnnualBonus column has not been imported for any past date.

Figure 92 – Sample of queries that checks if locally developed fields were populated correctly in InsightWarehouse’s
tables and views
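
One of those checks could look like the following sketch; BusinessDate is an assumed column name for the
snapshot date, and the date literal is purely illustrative.

SELECT TOP 100 AnnualBonus
FROM InsightWarehouse.dbo.v_Customer
WHERE BusinessDate = '2018-05-14';   -- replace with the current business date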


Adding a New Column from a business rule in Data Manager to the Data Warehouse with SQL Script (Dim Column)

This section illustrates how to add a new column to the Data Warehouse whose value is derived through
a business rule defined in Data Manager.

Data Manager is a feature of the Analytics web front end that updates the Rules Engine’s tables stored in the
InsightETL database. It is used to define different types of business rules that, for example,
look up InsightWarehouse columns (e.g. a distinct list of Product Codes is mapped to a table containing
Classification and Category), create banding columns (e.g. the AgeGroup column based on the value of the
Age column) or assign specific values to a column through SQL functions or coding. For the Lookup and
Banding rules set up in Data Manager, users also need to specify the values to be mapped and the available
bands to be used, respectively (for more information about the available rule types in Data Manager,
please refer to the Analytics Application Web Front End User Guide).

Furthermore, Data Manager allows the rules designer to set up business rules that will be applied, during
Analytics ETL, to tables in the InsightLanding, InsightSource and InsightStaging databases. Through this
facility, the abstraction views in the InsightWarehouse database can also be updated, even though it is not
possible to apply business rules directly to Bridge, Dim and Fact tables in this database. If users want to
apply Data Manager business rules to InsightWarehouse tables, they should apply these rules to
InsightStaging and then map the new fields resulting from them to InsightWarehouse columns; in the case
of new columns, users will have to configure them in the Warehouse.

During Analytics ETL, the business rules defined through Data Manager will be processed and applied to
the appropriate database tables (i.e. the temporary tables in the InsightStaging database, in the example
considered in this section). If users want these values to be loaded into the InsightWarehouse database,
they will have to create a corresponding definition in the DataDictionary table.

This section uses the AnnualBonusGroup column to demonstrate how to carry out the aforementioned
configuration process. The value of the new AnnualBonusGroup column will be determined by a banding rule
based on the value of the previously added AnnualBonus column. This section will also demonstrate the
set-up of a Dim column.

Pre-Update checks
Before configuring the new AnnualBonusGroup column, users need to ensure that the column on which the
grouping rule depends, i.e. AnnualBonus, is correctly imported in the Advanced Analytics Platform.
If the source column which works as the basis for the business rule is part of the ModelBank, Framework
or Country ModelBank configuration, it will be available by default in the Advanced
Analytics Platform. If, as in this case, the source column is part of the Local Configuration layer, users
have to manually configure its import in the Advanced Analytics Platform. This process is described in detail
in the Adding a New Column from a Source System or from InsightETL to the Data Warehouse with SQL
Script (Fact Column) section.
The second check that needs to be performed is on the DataDictionary table in the InsightWarehouse
database, which contains a definition for all columns of data tables or views in InsightWarehouse. If no
AnnualBonusGroup column definition exists, as shown in Figure 93, users will have to create it.


Figure 93 - Samples of query to check for duplicates in InsightWarehouse

The last check required is to ensure that the new rule is not already available in Data Manager, to avoid
duplicates. First, users should log onto the Analytics web front-end application, open the System Menu and
select the Data Manager option.

Once the Data Manager screen is open, users can use the search box on the top left-hand side of
the screen to check whether a rule with a name similar to the one about to be created already exists, as
shown in Figure 94.

Figure 94 - Samples of duplicates check in Data Manager

In this sample, the query returns no data – this means that no Annual Bonus-related rule exists.

Create Data Manager Business Rule


Once the pre-update checks are completed, users should input the new AnnualBonusGroup banding rule
through the Data Manager facility.

Remaining on the Data Manager screen, users should clear the criteria entered in the search box, then click
on the InsightStaging database in the Data Manager menu. This ensures that the list of existing Data
Manager rules for this database is displayed in the centre of the screen and also that
the new rule to be created is added to InsightStaging. Then, users should click on the “New” button
on the left-hand side of the screen, as shown in Figure 95.


Figure 95 – Adding a new rule to InsightStaging in Data Manager

This will bring up the Add new rule screen. This screen is divided into four tabs, and users should fill in the
fields in the Rule General section as shown in Figure 96.

Figure 96 – Adding a new rule screen – Definition – Rule General tab

On the Source Data tab, users should type in the name of the StagingCustomer table as the SourceTable,
then select the AnnualBonus column from the StagingCustomer table.


Figure 97 – Adding a new rule screen – Definition – Source Data tab

On the Custom Data tab, users should add the AnnualBonusGroup as a custom field.

Figure 98 – Adding a new rule screen – Definition – Custom Data tab

On the Execution tab, users can leave the default values set for the Execution Phase field, i.e. Extract, and
for the Execution Step, i.e. 1. These two parameters control when the current business rule is executed
within the core part of Analytics ETL; so, in case a business rule is dependent on
another rule, the parameters on the Execution tab should be updated accordingly. For example, if the current
business rule were dependent on a parent rule with Execution Phase = Extract and Execution Step = 1, the
child rule should have Execution Phase = Extract (or later) and Execution Step = 2 (or higher). Once the
Execution tab has been filled in, users should save the record.

Figure 99 – Adding a new rule screen – Definition – Execution tab

As soon as the record is saved, the Data Mappings tab appears. Users can click on this tab and then on the
Plus button to define the banding groups for Annual Bonus Group and the mapping between these groups
and values in Annual Bonus. These bands and mappings should be defined as shown below.

Figure 100 – Adding a new rule screen – Data Mappings


Configuring InsightWarehouse
New definitions for the AnnualBonusGroup column have to be added to the DataDictionary table in
InsightWarehouse. DataDictionary stores the definitions for all columns in the Dim, Fact and
Bridge tables and for all the abstraction and Cubes views within the InsightWarehouse database. Therefore,
users should add a new entry in DataDictionary for each table or view where the AnnualBonusGroup column
should appear.

In the previous section, the AnnualBonus was classified as a Fact column. The new AnnualBonusGroup
column, instead, will be classified as a Dimension to provide an example of how Dim definitions should be
defined in DataDictionary.

Creating new DataDictionary definitions for Dimensions


Figure 101 shows a sample of INSERT statement used to add a new Dim column definition in MS SSMS.

Figure 101 – Sample of insert statement to add a new Dim definition in InsightWarehouse..DataDictionary
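
For comparison with the Fact sketch earlier, a hedged Dim version could look as follows; the column names
are again partly assumed, the varchar(50) data type is illustrative, and the SCDType value shown (2) is an
example only – it must reflect the desired dimension change-handling behaviour, while the view-level
entries keep SCDType as NULL.

INSERT INTO InsightWarehouse.dbo.DataDictionary
    (TableName, ColumnName, DataType, Configuration, SCDType)       -- names partly assumed
VALUES
    ('DimCustomer',       'AnnualBonusGroup', 'varchar(50)', 'Local', 2),    -- SCDType illustrative
    ('v_Customer',        'AnnualBonusGroup', 'varchar(50)', 'Local', NULL),
    ('v_CubeDimCustomer', 'AnnualBonusGroup', 'varchar(50)', 'Local', NULL);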

The sample above includes three new entries: first, the definition of AnnualBonusGroup as a new column in
the DimCustomer table; then, the definition of AnnualBonusGroup as a new column in the v_Customer view
(which is part of the abstraction layer of InsightWarehouse); and, finally, the definition of
AnnualBonusGroup as a new column in the v_CubeDimCustomer abstraction view, used for the update of
the SSAS Cubes Measures. This means that the new AnnualBonusGroup column will not only be available
in the Warehouse but will also be exposed to Reports and SSAS Cubes.

If the three entries above are compared with the previously inputted definitions for the AnnualBonus
Fact column, one important difference appears in the structure of the SQL statements used: when users
define a new column for a Dim table, they also have to specify a value for SCDType in the DimCustomer-
specific record. This parameter defines how dimension changes should be handled by InsightWarehouse;
it should always be set to NULL in Fact column definitions – please refer to the Advanced
Analytics Platform Technical Guide for further details.

If the INSERT statement runs without any errors, users can move to the next step.

Checking Duplicates and updating Column in Table and Views


In order to perform duplicate checks in DataDictionary and also to add the new column to the DimCustomer
table and to the v_Customer and v_CubeDimCustomer views, users should execute the
s_DDCombinedRecords_update, the s_TableStructureFromDD_update and the
s_ViewStructureFromDD_update stored procedures in the InsightWarehouse database.

These routines must be executed in order to update any Dim, Fact or Bridge table and any
abstraction or data source view in InsightWarehouse whenever a column definition is added to
DataDictionary.

InsightWarehouse Checks
Users should check that the new entries were correctly included in the DataDictionary table by running a
query similar to the one shown in Figure 102.


Figure 102 – Sample of query that checks locally developed definitions in InsightWarehouse..DataDictionary

It should be noted that, in addition to the three new entries defined in DataDictionary, which had the
Configuration value set to ‘Local’, there is also an extra definition for the AnnualBonusGroup column, with
Configuration set to ‘Combined Configuration’. This was generated by the s_DDCombinedRecords_update
stored procedure.

To ensure that the structure of the DimCustomer table has been amended correctly by the
s_TableStructureFromDD_update stored procedure, users can run a query on the DimCustomer table as
shown in Figure 103.

Figure 103 – Sample of query that checks if locally developed fields were added to InsightWarehouse’s table

The query output shows that the AnnualBonusGroup column has been correctly included in the
DimCustomer table but has not been populated yet, as Analytics ETL has not run.

The same applies to the v_Customer and the v_CubeDimCustomer views – if the
s_ViewStructureFromDD_update stored procedure has been executed correctly, the AnnualBonusGroup
column will be added to these two views but not yet populated, as shown in
Figure 104.


Figure 104 – Sample of queries that checks if locally developed fields were added to InsightWarehouse’s views

Executing Analytics ETL


Users will now populate the new AnnualBonusGroup column that was added to the InsightWarehouse database.
This is done by executing the Analytics ETL process in the Advanced Analytics Platform. Users can either
execute this agent job from step 1 or, if the Analytics ETL has already been executed for the current date
and only the new field has to be populated, they can start from the “Insight Attribute Calculations – Source”
step.

Once Analytics ETL has been executed correctly, users can check that the AnnualBonusGroup column has been
populated correctly in the DimCustomer table and in the v_Customer and v_CubeDimCustomer views.

Post-Update Checks
Figure 105 shows samples of queries that can be used to ensure that values for the AnnualBonusGroup
column were populated correctly across the target table and views in the InsightWarehouse database. The
WHERE clause of all the SELECT statements below filters results on the current business date,
as the AnnualBonusGroup column has not been imported for any past date.


Figure 105 – Sample of queries that checks if locally developed fields were populated correctly in InsightWarehouse’s
tables and views

