
Srikanth P

Sr Informatica Developer
Email: Srikanthp.dwh@gmail.com
Mobile No: (+91)-8698207492

Professional Summary:
• Overall 12 years of IT industry experience in design, development and implementation as an ETL Informatica Developer.
• Hands-on experience with Informatica PowerCenter, IICS, B2B DT Studio, PowerExchange (PWX) and Snowflake.
• Experienced in data analysis, data modeling, ETL, data warehousing, reporting, development, maintenance, testing and documentation.
• Strong skills in Informatica Data Quality (IDQ): Address Validator, Match, Merge, Key Generator, Case Converter, Standardizer, Labeler and Parser transformations, Human Task, Bad Record Exception, Workflow Manager, applications, DQ rule creation, mapplets and mappings.
• Hands-on experience in column-level profiling, primary key analysis, join profiling and rule sanitization for all source systems.
• Hands-on experience in creating classification models, reference tables, and physical and logical data objects (PDOs and LDOs) on source systems.
• Worked on the Address Validator transformation in IDQ, passing partial addresses and populating the full address in the target table.
• Hands-on experience in building scorecards and trend charts in Informatica Analyst for measuring data quality.
• Analyzed source data and provided basic profiling details to the client for decision making.
• Hands-on experience in building data lineage reports and running data quality rules to check quality.
• Worked on complex Informatica PowerCenter mappings using transformations such as SQL Transformation, Data Exchange (DX) and Unstructured Data Transformation (UDT).
• Hands-on experience with different data sources such as Oracle, Teradata, Greenplum, MS SQL Server, flat files, EBCDIC files and XML files.
• Implemented Type 1/Type 2/Type 3 SCD, incremental and CDC logic according to business requirements.
• Experience in debugging mappings; identified bugs in existing mappings by analyzing data flow and evaluating transformations.
• Hands-on experience in performance tuning of sources, targets, transformations and sessions.
• Involved in test deployment of Informatica objects using Informatica deployment groups, and of non-Informatica objects using Octopus and the UNIX deployment process.
• Natural ability to pull out exceptionally good results in unfavorable conditions.
• Scrum experience with Agile methodology.

Work Experience:
• Working as a Lead Technical Specialist for HCL, Hyderabad, from Dec 2021 till date.
• Worked as a Technical Specialist-1 for CitiusTech Healthcare Technology Private Limited, Hyderabad, from Jan 2021 to Dec 2021.
• Worked as an Advisor for DELL International Services, Hyderabad, from Apr 2020 to Jan 2021.

Internal Use - Confidential


• Worked as a Senior Consultant with Capgemini India Private Ltd from Apr 2013 to Mar 2020.
• Worked as an ETL Developer at Netlink Software Private Ltd from Dec 2011 to Mar 2013.

Technical Exposure:

Data Warehousing Tools: IICS, MFT, Informatica PowerCenter, Informatica Data Quality, PWX, DT Studio and StreamSets
Databases/RDBMS/Others: Oracle, Snowflake, XML, Greenplum, flat files and Excel files
Data Modeling: ERWIN, Star schema, Snowflake schema, physical and logical modeling
Programming Languages: PL/SQL and UNIX shell scripting
Job Control and Other Tools: Tivoli, Control-M, PuTTY, SQL*Plus and Developer tool

Project #5
Client: Axalta 12/2021 – Till Date
Role: ETL Developer

Description:
Axalta Coating Systems is a leading global coatings company dedicated solely to the development, manufacture and sale of liquid and powder coatings. We provide a range of performance and transportation coatings for manufacturers of light and commercial vehicles, the refinish aftermarket, and many industrial applications.

Responsibilities:
• Understood requirements from business analysts/architects and framed them into requirement documents.
• Prepared technical documentation of system design, implementation, extraction, coding information and troubleshooting.
• Responsible for analyzing and developing extraction, transformation and loading processes using the ETL tool Informatica.
• To run the process end to end, all sources are needed: flat files and data from SAP BW.
• Worked on various other applications in PowerCenter such as AITRA, ARIBA and E-Commerce.
• Used the WinSCP tool to place the source file, or any other file, on the server.
• Worked on the IICS tool to integrate data with multiple source systems such as ERP systems (SAP R/3, SAP S/4HANA) and e-commerce platforms (Salesforce), on both IICS and PowerCenter.
• Involved with clients/business to understand business needs and coordinated with the team to achieve them.
• Involved in unit and integration testing of Informatica sessions.
• Prepared and used test data cases to verify the accuracy and completeness of the B2B/ETL process.
Environment: Informatica 10.x, B2B DX, IICS, SQL Server, Snowflake, UNIX and MFT.

Project #4



Client: Daman 02/2021 – 12/2021
Role: ETL Developer

Description:
Daman is the UAE's leading health insurer and trusted Government partner, offering leading solutions that provide access to unparalleled healthcare services. Daman needed to create a centralized data warehouse (named DS360) for business analysis. Daman wants to load all of its data into this warehouse, including health insurance modules such as claims and policies. It is one place to get all the information about individual customers and agents.

Responsibilities:
• Understood requirements from business analysts/architects and framed them into requirement documents.
• Prepared technical documentation of system design, implementation, extraction, coding information and troubleshooting.
• Responsible for analyzing and developing extraction, transformation and loading processes using the ETL tool Informatica.
• Worked on Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager and Workflow Monitor.
• Worked on Informatica Data Quality tools: content sets, mapplets, mappings and workflows.
• Worked in Agile methodology and was involved in sprint planning and review meetings.
• Created sessions and configured workflows to extract data from various sources, transform the data and load it into SQL Server.
• Used Type 1 and Type 2 SCD mappings to update slowly changing dimension tables.
• Responsible for development, SIT and production environment migration processes.
• Performed high-level smoke/sanity checks prior to the validation phase.
• Executed unit/validation test cases and test scripts.
Environment: Informatica 10.x, IDQ, SQL Server, Oracle, UNIX and Control-M.

Project #3
Client: Dell Financial Services 4/2020 – 01/2021
Role: ETL Developer

Description:
Dell Technologies is an American multinational information technology corporation based in Round
Rock, Texas. It was formed as a result of the acquisition of Dell and EMC Corporation which later
became Dell EMC. Dell's products include personal computers, servers, smartphones, televisions,
computer software, computer and network security, as well as information security services.
Global Operations and Supply Chain includes several sub-projects such as GTM, ABAO and MT Bonnell, which mainly allow supply available on a given day to be saved in order to meet demand at a later date. This gives Dell the flexibility to plan around future days where we think demand will spike, or days where we believe there may be a part shortage. An important distinction is that allocations aren't necessarily just commodities; the product often deviates from what was ordered due to part substitutions and other factors.

Responsibilities:
• Responsible for analyzing and developing extraction, transformation and loading processes using the ETL tool Informatica.




• Worked on Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager and Workflow Monitor.
• Worked on Teradata macros and Greenplum functions for data processing.
• Worked in Agile methodology and was involved in sprint planning and review meetings.
• Created pipelines using StreamSets and Azure Data Factory.
• Developed mapping parameters and variables to support SQL overrides.
• Created sessions and configured workflows to extract data from various sources, transform the data and load it into Teradata and Greenplum tables.
• Used Type 1 and Type 2 SCD mappings to update slowly changing dimension tables.
• Modified/optimized existing models, or provided a solution-driven approach, for better performance.
• Responsible for preparing run books for developed projects and handing over the transition to L1/L2 teams.
• Responsible for development, SIT and production environment migration processes.
• Created automation for manual jobs, reducing load time on the system.
Environment: Informatica 10.x, Teradata, Greenplum, Control-M, StreamSets and Kafka.
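The job-automation work described above (replacing manual jobs with scheduled scripts) can be illustrated with a minimal shell sketch. All names, the poll interval and the retry count here are hypothetical, not taken from the actual project:

```shell
#!/bin/sh
# Minimal sketch of the kind of shell automation used to replace a manual job:
# poll for a source file to land, then let the load process take over.
# Function name, poll interval and retry count are illustrative only.

wait_for_file() {
    file="$1"
    tries="${2:-3}"
    i=0
    while [ "$i" -lt "$tries" ]; do
        # -s: file exists and is non-empty
        if [ -s "$file" ]; then
            echo "found $file"
            return 0
        fi
        i=$((i + 1))
        sleep "${POLL_SECS:-1}"
    done
    echo "timed out waiting for $file" >&2
    return 1
}

# Example: one poll for a (hypothetical) extract file
wait_for_file /tmp/demo_extract.dat 1 || true
```

A scheduler such as Control-M would then invoke this wrapper before the load step, so an operator no longer has to check for the file by hand.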

Project #2
Client: Barclays, SA. 4/2013 – 03/2020
Role: ETL Developer

Description:

ABSA and Barclays are embarking on a journey to enhance and replace their current ETL and data warehouse environments. This entails a new implementation of Informatica, which will replace the current ETL toolsets such as Ab Initio, Oracle Pure Extract and Oracle Warehouse Builder. The current data warehouses are built on Oracle and will be replaced with Teradata. The new data warehouse environment will be based on the new data model, data standards, data governance structure and operating model.
Responsibilities:

• Performed requirement gathering, analysis, design, development, testing, implementation, support and maintenance phases.
• Parsed high-level design specifications into simple ETL coding and mapping standards.
• Created mapping documents to outline data flow from sources to targets.
• Analyzed source data and provided profiling details to the client for decision making.
• Created scorecards on source data in IDQ Analyst.
• Implemented the necessary DQ rules in IDQ Analyst while profiling the data.
• Extensively worked on various objects (Parser, Mapper and Serializer) in DT Studio for the AnP framework.
• Developed parser code in DT Studio for field validation and file validation.
• Extracted data from flat files and other RDBMS databases into the staging area and populated it onto the data warehouse.
• Used various transformations and objects: Match, Merge, Key Generator, Case Converter, Standardizer, Labeler and Parser transformations, Human Task, Bad Record Exception, Workflow Manager, applications, DQ rule creation, mapplets and mappings.
• Used the Address Validator transformation in IDQ, passing partial addresses and populating the full address in the target table.



• Extensively used SQL Transformation, Java Transformation, Data Exchange and Unstructured Data Transformation as part of the AnP framework.
• Worked on different tasks in workflows: sessions, event wait, decision, e-mail, command, worklets, assignment and scheduling of the workflow.
• Used Type 1 and Type 2 SCD mappings to update slowly changing dimension tables.
• Utilized Informatica Data Quality (IDQ) for data profiling, matching/removing duplicate data, fixing bad data and fixing NULL values.
• Configured the B2B setup for the project and migrated the B2B setup from one environment to another.
• Wrote UNIX shell scripts for the AnP framework and PMCMD commands to run workflows.
• Involved in performance tuning at the source, target, mapping, session and system levels.
• Prepared migration documents to move the mappings from the development to the testing and then to the production repositories.
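The UNIX/PMCMD scripting mentioned above can be sketched as a small wrapper around pmcmd startworkflow. The service, domain, user, folder and workflow names below are hypothetical, and the DRY_RUN switch exists only so the wrapper can be exercised without an Informatica installation:

```shell
#!/bin/sh
# Sketch of a shell wrapper around pmcmd startworkflow, in the style of
# framework run scripts. Service/domain/user/folder names are hypothetical.
# DRY_RUN=1 prints the assembled command instead of executing it.

run_workflow() {
    folder="$1"
    workflow="$2"
    if [ -z "$folder" ] || [ -z "$workflow" ]; then
        echo "usage: run_workflow <folder> <workflow>" >&2
        return 1
    fi
    # -sv: Integration Service, -d: domain, -pv: env var holding the password,
    # -wait: block until the workflow completes so the caller sees its status.
    cmd="pmcmd startworkflow -sv INT_SVC -d DOM_DEV -u etl_user -pv PM_PASS -f $folder -wait $workflow"
    if [ "${DRY_RUN:-0}" = "1" ]; then
        echo "$cmd"
    else
        $cmd
    fi
}

# Example: dry-run a (hypothetical) staging workflow
DRY_RUN=1 run_workflow WH_LOAD wf_stg_customer
```

Passing the password via an environment variable (-pv) rather than on the command line keeps it out of process listings, which matters when such scripts are scheduled through a job controller.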

Environment: Informatica PowerCenter 10.2, Informatica IDQ 10.2, PWX, Teradata V14, Oracle 10g, SQL Developer, UNIX shell scripting, flat files, XML files, Agile methodology and Tivoli.

Project #1
Client: KATZ Rexall Group, Canada 08/2011 – 02/2012
Role: ETL Developer

Description:

KATZ Group is one of the biggest drugstore chains in Canada; Rexall is the group under which they sell their pharmacy products. KATZ has an existing data warehouse for Rexall which has been running for the past 6 years. We are redesigning the existing data warehouse so that we can store more data in a more appropriate way and can easily pull effective information from the new environment that was missing in the old system. The new architecture is much more reliable and user friendly, and specifically distinguishes between dimensions and facts. Many new data warehouse concepts, such as SCD and IDS, have been added in the re-architecture. The new architecture increases reliability of the data, and the business will benefit from these efforts. The new environment exposes more dimensions, which gives the existing data more meaning; the data is now more expressive than ever before.
Responsibilities:

• Analyzed business requirements, technical specifications, source repositories and physical data models for ETL mapping and process flow.
• Designed and developed complex mappings using Lookup, Expression, Update Strategy, Sequence Generator, Aggregator, Router, Stored Procedure and other transformations to implement complex logic.
• Worked with Informatica PowerCenter Designer, Workflow Manager, Workflow Monitor and Repository Manager.
• Developed and maintained ETL (extract, transform and load) mappings to extract data from multiple source systems such as Oracle, SQL Server and flat files, and loaded it into Greenplum.
• Developed Informatica workflows and sessions associated with the mappings using Workflow Manager.
• Experience in performance tuning of Informatica (sources, mappings, targets and sessions) and tuning SQL queries.



• Automated the ETL process by scheduling exception-handling routines, as well as source-to-target mapping development, support and maintenance.
• Involved in debugging Informatica mappings, testing stored procedures and functions, and performance and unit testing of Informatica sessions, batches and target data.
• Developed mapplets, reusable transformations, and source and target definitions and mappings using Informatica.
• Involved in performance tuning of mappings in Informatica.
• Good understanding of source-to-target data mapping and the business rules associated with the ETL processes.
• Understood PL/SQL scripts and developed the mappings accordingly.
• Used transformations such as Joiner, Filter, Lookup and Aggregator in Informatica ETL to transform the data.
• Prepared test data for regression testing and validated the target data against the source data.
• Responsible for configuring workflow tasks such as the e-mail task and command task, and making them reusable for other team members.
• Used PMCMD commands to start, stop and ping the server from UNIX.
• Responsible for unit testing and UAT to check data quality, and for documenting the test results.

Environment: Informatica PowerCenter 9.x, Oracle 9i, SQL Server, Greenplum, flat files, Control-M and PL/SQL.

Education
M.Tech (Software Engg) Jawaharlal Nehru Technological University, Hyderabad.

