
Santhoshi Musku (ETL Consultant)
GC-EAD (employment-based, through H1)
santhoshi.9833@gmail.com
656 216 9643
Summary:
● Over 15 years of IT experience in the analysis, design, development, testing, and implementation of business application systems for the Healthcare, Financial, Telecom, and Timeshare sectors.
● Strong experience in the analysis, design, development, testing, and implementation of Business Intelligence solutions using data warehouse/data mart design, ETL, OLAP, BI, and client/server applications.
● Strong experience in the dimensional data modeling methodologies of Ralph Kimball and Bill Inmon.
● Around 14 years of strong data warehousing ETL experience using Informatica PowerCenter 10.5/10.1/9.6/9.1/8.x, including the PowerCenter client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
● 4 years of hands-on experience in Informatica IICS/IDMC Cloud Data Integration (CDI) and Cloud Application Integration (CAI).
● Worked with IICS/IDMC Data Integration components such as taskflows, mappings, and the various task types.
● Experience creating JSON files in IICS CAI; used service connectors, app connectors, and process objects in Cloud Application Integration.
● Knowledge of designing the data model/database structure by identifying facts and dimensions; physical and logical data modeling using ERwin and ER/Studio.
● Expertise in working with various sources such as Oracle 11g/10g/9i/8.x, SQL Server 2008/2005, flat files, COBOL/VSAM, and DB2 on the mainframe.
● Extensive experience developing stored procedures, cursors, functions, views, triggers, and complex SQL queries using SQL Server T-SQL and Oracle PL/SQL. Utilized AUTOTRACE and EXPLAIN PLAN to monitor SQL query performance.
● Experience in resolving on-going maintenance issues and bug fixes; monitoring
Informatica sessions as well as performance tuning of mappings and sessions.
● Created pre-session, post-session, pre-SQL, and post-SQL commands in Informatica. Worked with Informatica transformations such as Aggregator, Lookup, Joiner, Filter, Router, Update Strategy, Transaction Control, Union, Normalizer, and SQL in ETL development.
● Worked with Event wait and event raise tasks, mapplets, reusable transformations.
● Experience in Integration, Functional, Regression, System Testing, Load Testing,
and UAT Testing.
● Worked with parameter files to simplify connection management across the Dev/QA/Prod environments. Experience in all phases of data warehouse development, from requirements gathering through code development, unit testing, and documentation.
● Extensive experience writing UNIX shell scripts and automating ETL processes. Worked with Informatica deployment groups and batch scripts for code migration from one environment to another. Used version control.
● Experience using automation/scheduling tools such as Autosys and Tidal. Worked with the QA team to perform testing, write test cases, document test results, and raise defects in HP QC.
● Worked extensively with slowly changing dimensions (SCD Type 1 and Type 2).
● Familiar with reporting tools such as OBIEE, Business Objects, and Cognos ReportNet.
● Created Business Requirement Documents (BRD), Functional Requirement
Documents (FRD), data flow diagrams.
● Worked end to end in SDLC and experience in creating and developing business
requirements.
● Worked in Agile and waterfall methodologies.
Project management tools: JIRA, ServiceNow, TFS, SharePoint, Azure DevOps
Version control and CI tools: SVN, Git, Jenkins
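For illustration, the SCD Type 2 handling mentioned above can be sketched in Python. This is a minimal sketch, not client code: the column names (`key`, `attr`, `eff_from`, `eff_to`, `current`) and the 9999-12-31 high date are assumptions standing in for a real dimension table.

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # open-ended "current row" effective date

def scd2_merge(dim_rows, incoming, today):
    """Apply SCD Type 2: expire changed rows and insert new versions.

    dim_rows: list of dicts with keys key, attr, eff_from, eff_to, current
    incoming: dict mapping business key -> new attribute value
    """
    out = list(dim_rows)
    for key, new_attr in incoming.items():
        cur = next((r for r in out if r["key"] == key and r["current"]), None)
        if cur is None:
            # brand-new key: insert the first version
            out.append({"key": key, "attr": new_attr,
                        "eff_from": today, "eff_to": HIGH_DATE, "current": True})
        elif cur["attr"] != new_attr:
            cur["eff_to"] = today        # close out the old version
            cur["current"] = False
            out.append({"key": key, "attr": new_attr,
                        "eff_from": today, "eff_to": HIGH_DATE, "current": True})
        # unchanged rows are left alone: Type 2 keeps history only on change
    return out
```

In a real mapping this merge is done by the Lookup/Update Strategy transformations rather than in Python; the sketch only shows the expire-and-insert logic.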

Education:
● Bachelor of Engineering in Electrical and Electronics, JNTU, Hyderabad, India, 2003.
LinkedIn: https://www.linkedin.com/in/santhoshi-reddy-m-785a6b255/

Professional Experience:

Client: EverBank, Raleigh, NC (Remote)
Role: ETL/Informatica IICS/IDMC Consultant                Nov 2023 – Present
Responsibilities:
● Worked closely with BSA and business users to gather and understand
requirements.
● Prepare technical specifications document and provide estimates for Development
and unit tasks.
● Design and develop ETL mappings/Sessions/Workflows.
● Worked on production support tickets, resolving the data conflicts.
● Migrated taskflows/mapping tasks from one environment to another in
Informatica.
● Created multiple mapping tasks, taskflows in IICS/IDMC CDI (Data Integration).
● Worked with IICS components such as File Listener and Saved Query, and with features such as the Hierarchy Parser and Hierarchy Builder transformations, expression macros, and in-out parameters.
● Worked with linear, sequential, and parallel taskflows to integrate various data transformation tasks.
● Experience creating connections such as Oracle, SQL Server, Snowflake, and AWS S3 in the IICS Administrator.
● Pulled data from AWS S3 buckets and loaded it into a Redshift database.
● Expertise in Data Integration components: DSS, replication tasks, mappings/MCTs, scheduling, and REST API calls.
● Used the Postman tool to connect to REST APIs and retrieve data via URL.
● Worked with the payload option and connectors in CAI when working with JSON files.
● Wrote Python scripts using variables, operators, strings, loops, etc.
● Created different steps in Taskflows like Assignment Step, Notification Task,
Command Task, File Watch Task, Decision Step, Parallel Path, Jump Step.
● Experience creating service connectors, app connections, and process objects using Cloud Application Integration (CAI).
● Pulled metadata from the Informatica cloud server to collect taskflow status, failures, and stats, and imported the data into local tables.
● Imported data from a Swagger file using the REST V2 connector and loaded it into a table.
● Automated/Scheduled IICS (IDMC) cloud jobs to run daily with email notifications
for any failures.
● Worked with flat file sources and targets to push data to end users. Worked with Oracle and SQL Server sources and a Snowflake target to load data into the data warehouse.
● Familiar with the SnowSQL command-line environment for performing SQL operations.
● Worked with SCD Type 2 to maintain history of the incoming data from business.
● Used performance-tuning techniques to improve both database and Informatica performance by identifying bottlenecks.
● Used Informatica parameterization to increase flexibility, reusability, and readability.
● Maintain end to end documentation of project activities like Technical specs doc,
ETL design doc, production support doc.
● Worked with UNIX shell scripting for file management, automation, and scheduling of Informatica PowerCenter workflows.
● Worked with SQL and PL/SQL stored procedures; migrated code from PL/SQL to Informatica.
● Worked on the production support team to ensure support tickets were handled and resolved by priority per SLA.
● Scheduled Informatica workflows using TIDAL Scheduler.
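The failure-notification pattern above (scheduled cloud jobs that email on failure) can be sketched with Python's standard library. The sender address, recipients, and subject format are illustrative assumptions, and the actual SMTP delivery step is omitted:

```python
from email.message import EmailMessage

def build_failure_alert(job_name, error_text, recipients,
                        sender="etl-alerts@example.com"):
    """Build a failure-notification email for a scheduled job.

    All addresses and the subject format are placeholder assumptions.
    """
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = ", ".join(recipients)
    msg["Subject"] = f"[FAILED] IICS job: {job_name}"
    msg.set_content(f"Job {job_name} failed.\n\nError:\n{error_text}\n")
    return msg  # in practice, send with smtplib.SMTP(...).send_message(msg)
```

Building the message separately from sending it keeps the alert content easy to unit-test without a mail server.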

Environment: Informatica PowerCenter 10.x, Informatica Cloud (IICS)/IDMC, Oracle, SQL Server, Snowflake DB

Client: Ajel LTD, Remote
Role: ETL/Informatica IICS/IDMC Consultant                Sept 2021 – Oct 2023
Responsibilities:
● Lead design, development, and implementation of the ETL projects end to end.
Responsible for ETL technical design discussions and prepared ETL high level
technical design document. Worked with Business users to understand the data,
understand the business and gather requirements.
● Interacting with onsite and offshore team to assign Development tasks and
scheduling weekly status calls with offshore team on status. Extracted data from
flat files, Oracle, Sql server using Informatica ETL mappings and loaded to Data
Mart.
● Created complex Informatica mappings using Unconnected Lookup, Joiner, Rank, Source Qualifier, Sorter, Aggregator, Lookup, and Router transformations to extract, transform, and load data into the data mart area.
● Worked with Informatica Cloud Data Integration to develop mapping tasks and taskflows and to use parameterization.
● Created connections using a service connector built from a WSDL file with SOAP APIs in CAI.
● Used postman tool to connect to Informatica cloud server to import metadata using
CAI process.
● Pulled data from AWS S3 buckets and loaded it into a SQL Server database.
● Read data from SQL Server and loaded it into a Redshift database.
● Used IICS (IDMC) CDI components: mapping tasks, mapplets, taskflows, in-out parameters, and Saved Query.
● Worked with Synchronization tasks, Data Integration tasks, and replication tasks.
● Improved mapping performance using partitioning in IICS (IDMC). Worked with horizontal and vertical macros.
● Installed and configured secure agent as part of IDMC admin activities.
● Used pmcmd commands to invoke Informatica workflows and manage schedule dependencies.
● Worked extensively on shell scripting for file management.
● Developed automated reports to generate the ETL test results for data errors and
email to end users/Developers for corrections.
● Used excel and excel functions to perform calculations based on multiple columns.
● Extensively used Excel functions such as IF, MATCH, LOOKUP, VLOOKUP, SUM, and SUMIF, plus pivot tables.
● Created reusable transformations/mapplets and used them across various mappings.
● Wrote complex PL/SQL scripts /functions/procedures/packages.
● Developed Informatica workflows/worklets/sessions associated with the mappings
using Workflow Manager.
● Worked on performance tuning of Informatica sessions using partitioning like pass
through and dynamic partitioning.
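The pmcmd invocations mentioned above can be wrapped in a small helper; a hedged sketch follows. The service, domain, and folder names are placeholders, and the `runner` parameter exists only so the wrapper can be exercised without an Informatica installation (pmcmd's `-pv` flag reads the password from an environment variable, keeping it off the command line):

```python
import subprocess

def pmcmd_start_workflow(service, domain, user, pwd_env, folder, workflow,
                         wait=True, runner=subprocess.run):
    """Build and run a pmcmd startworkflow command (names are placeholders)."""
    cmd = ["pmcmd", "startworkflow",
           "-sv", service, "-d", domain,
           "-u", user, "-pv", pwd_env,   # -pv: password environment variable
           "-f", folder]
    if wait:
        cmd.append("-wait")              # block until the workflow completes
    cmd.append(workflow)
    return runner(cmd)
```

A scheduler (Tidal, cron, etc.) would call this wrapper per workflow; injecting `runner` also makes dependency chaining between workflows straightforward to test.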

Environment: Informatica PowerCenter 10.4, IICS, CDI, CAI, flat files, AWS S3, Python, Azure, Redshift, SQL Server, Oracle

Client: Deloitte (Medifast Project), Remote
Role: Informatica IICS (IDMC)/SSIS Consultant                Feb 2021 – Aug 2021
Responsibilities:
● Worked with the IICS (IDMC) tool to develop ETL mappings to pull data from various sources such as Amazon S3 and SQL Server.
● Installed and configured the secure agent in the IDMC (IICS) environment.
● Worked with the RunAJob CLI utility to schedule jobs in IDMC via scripting.
● Gather business requirements, prepare STTM (Src to Tgt mapping) document,
develop, deploy, and support projects.
● Worked with IDMC CDI components, especially taskflows, mapping tasks, and mappings.
● Migrated the taskflows from Dev to UAT to Prod in IDMC using import/export.
● Used CAI components to import data from external sources using the JSON/XML
process objects to import data into local database.
● Invoked data integration components from CAI process to automate the taskflows.
● Scheduled the IDMC jobs using the cloud scheduler in the admin console.
● Provided code reviews, helped junior developers in the team in development
activities.
● Involved in Audit framework design, development and implementation.
● Worked with QA team to fix code issues and deploy objects from one environment
to another.
● Designed & developed a recon process between various source systems and target
table to generate emails if any discrepancies in counts or key columns like AMT.
● Work in Production support tasks (BAU) on weekly basis.
● Designed, developed, tested and deployed SSIS packages that implement ETL
processes.
● Worked with MDS (Master Data Services) for master data management, connecting to SQL Server and performing updates/inserts within SQL Server from Excel.
● With MDS, data is loaded into staging tables, validated using business rules, and then loaded into the MDS tables.
● Worked on ad hoc requests and issues from production jobs, enhancements, and report data.
● Worked as an individual contributor and a team player per the project timelines.
● Used Tidal for scheduling the Informatica PowerCenter workflows.
● Worked with VSAM (COBOL, mainframe) files as a source; pulled the data using the Normalizer transformation.

Environment: Informatica PowerCenter, IICS (IDMC), Amazon Redshift/S3 (AWS), SQL Server, Oracle

Client: The Economist, Remote
Role: Team Lead / Sr. ETL IICS (IDMC) Consultant                Sep 2020 – Jan 2021
Responsibilities:
● Worked exclusively on IICS (IDMC) Data Integration and CAI projects.
● Worked with IICS (IDMC) Data Integration components such as taskflows, mappings, MCTs, DSS, and replication tasks.
● Developed multiple mappings, MCTs, and taskflows to load data from files/SQL Server and Snowflake into SQL Server.
● Worked with CAI components: processes, service steps, process connectors, process objects, etc.
● Invoked MCTs via the API; worked with the REST API in CAI.
● Experience creating connections such as Oracle, SQL Server, Snowflake, AWS S3, and Salesforce in the IICS Administrator.
● Created different kinds of tasks in IICS (IDMC), such as Mapping tasks, Data Synchronization tasks, Dynamic Mapping tasks, and Data Transfer tasks, using sources such as Oracle DB and saved queries and targets such as Snowflake.
● Worked with unstructured and semi-structured files such as XML and JSON.
● Deployed IICS components from one environment to another, from Dev to QA and PROD.
● Created complex SSIS packages using proper control-flow and data-flow elements with error handling.
● Designed and developed SSIS packages to move data from various sources into destination flat files and databases.
● Troubleshot existing SSIS packages to resolve performance-tuning issues.
● Modified existing SSIS scripts and processes for enhancements.
● Wrote SQL, PL/SQL, and SQL*Plus programs to retrieve data using cursors and exception handling.
● Performed SQL, PL/SQL, and application tuning using tools such as tkprof, AUTOTRACE, and DBMS_SQLTUNE.

Environment: Informatica PowerCenter, IICS (IDMC), SSIS, Snowflake, SQL Server, XML, JSON

Client: HealthNet/Centene, Folsom, CA (USA)                Jan 2017 – Oct 2019
Role: Sr. Informatica Developer
Description: HealthNet required a RADM (Reporting and Analytics Data Mart) to analyze provider performance. The data mart is designed and developed with claims, member, and provider data. Key player in ACA submissions to the EDIFECS EDGE server and in data submissions to CMS for Medicare and duals, for both professional and institutional claims.

Responsibilities:
● Owned several projects independently and was a key player in data submissions and data validation. Created database objects/views/stored procedures to replace ETL code. Worked with SQL analytic functions, PIVOT, etc.
● Developed ETL processes in SQL Server Management Studio using SSIS packages. Implemented Informatica performance tuning and database performance tuning by analyzing queries with EXPLAIN PLAN.
● Implemented partitioning techniques at the database and Informatica levels to improve performance. Worked with Teradata SQL utilities, BTEQ, and FastExport to move data between the database and files.
● Familiar with healthcare X12-format 835/837 transactions and EDI data exchanges.
● Created and modified SQL*Plus, PL/SQL, and SQL*Loader scripts for data conversions.
● Worked with the error-corrections team to analyze errors in claims data received from provider groups. Understood the claims process and worked with provider groups on technical details to receive correct data from the providers in accordance with HIPAA regulations.
● Worked with pharmacy data submissions to DHS/DHCS for the CA market. Developed SCD Type 2 Informatica mappings to load data into dimension and fact tables. Used Informatica transformations such as Lookup, Joiner, Aggregator, reusable objects, SQL transformation, and Stored Procedure to perform ETL tasks.
● Led the team, working with business analysts to get the requirements and prepare the ETL source-to-target mapping document/ETL specifications. Developed UNIX scripts for Informatica PowerCenter jobs and file processing via SFTP.
● Performed peer reviews for ETL Code for best practices, consistency, quality and
performance. POC for Informatica production support, POC for several reports
submitted to third party vendors.
● Automated several tasks in Informatica process to eliminate manual interference.
Performed Data analysis across different source systems for gap analysis.
● Worked on pharmacy claims and generated monthly KPI reports. Familiar with pharmacy data and reversed claims. Heavily used SQL queries, MS Access, Teradata, and Excel to create reports for upper management.
● Worked with advanced Excel functions such as VLOOKUP, LOOKUP, SUM, and MATCH.
● Good understanding of creating reports in OBIEE: dashboards, subject areas, SQL functions, and advanced SQL options.
● Familiar with OBIEE report debugging and reverse engineering in case of data issues.
● Upgraded Informatica PowerCenter from 9.x to 10.x and migrated mappings from one environment to another. Performed complete integration testing after the upgrade.
● Worked with the Netezza ODBC connector to load data from files into a Netezza target database.
● Loaded data into Netezza from source files using the FIFO concept.
Environment: Informatica PowerCenter 9.6/10.1, SSIS, SSRS, Power BI, OBIEE 11g, SQL Server 2008 R2, Oracle 11g, FTP, HIPAA, Teradata, Netezza, MS Access

Client: Bank of the West, San Ramon, CA                January 2016 – December 2016
Role: Lead/Sr. Informatica Developer
Responsibilities:
● Worked closely with BSA and business users to gather and understand
requirements. Prepare technical specifications document and provide estimates for
Development and unit tasks. Design and develop ETL
mappings/Sessions/Workflows.
● Worked with flat file sources and targets to push data to end users. Worked with Oracle and SQL Server sources and targets to load data into the data warehouse. Worked with SCD Type 2 to maintain history of the incoming business data.
● Worked with the team on the design and development of Informatica restartability. Worked with the IDQ (Informatica Data Quality) Developer tool to create mapplets, rules, LDOs, and mappings. Used Data Quality transformations such as Labeler and Case Converter.
● Extensively worked with UNIX scripting to perform tasks like file transfer, email
reports as attachments, operate ETL jobs etc. Worked with SQL scripts /stored
procedures in understanding the existing business logic and develop the same in
Informatica mappings.
● Used performance techniques to improve both database and Informatica
performance. Used Informatica Parametrization to increase flexibility, reusability
and readability. Worked with complex transformations like lookup, aggregator, SQL
transformation, Normalizer, Joiner.
● Maintain end to end documentation of project activities like Technical specs doc,
ETL design doc, production support doc.

Environment: Informatica PowerCenter 9.6/10.1, OBIEE 11g, SQL Server 2008 R2, Oracle 11g, FTP, CCAR, MS Access

Client: Genuine Parts Company, Atlanta, GA                June 2015 – December 2015
Role: Lead/Sr. Informatica Developer
Role: Lead/Sr. Informatica Developer
Responsibilities:
● Involved in all phases of the project SDLC to gather business requirements. Worked closely with the business analyst to understand the various source data. Involved in designing a star-schema-based data model with dimensions and facts.
● Designed the ETL mapping document to map source data elements to the target based on the star-schema dimensional model. Used Informatica dynamic partitioning and pushdown optimization to improve session performance.
● Worked with stored procedures to truncate tables and drop/recreate indexes in the pre-session and post-session properties. Worked with the scheduler to run Informatica sessions daily and send an email on load completion.
● Created complex SCD Type 1 and Type 2 mappings using Dynamic Lookup, Joiner, Router, Union, Expression, and Update Strategy transformations. Used parameter files to store session database connections for reusability across the Dev/QA/PROD environments.
● Lead the off-shore team for development and production support tasks. Worked
with QA team to debug and resolve defects raised during QA phase.
● Used reusable tasks like session task, email task, command task in workflow
manager. Worked in production support tasks, on-call rotation basis for daily
production jobs.

Environment: Informatica PowerCenter 9.6, Oracle 11g, SQL Server 2008 R2, FTP

Client: AT&T, Atlanta, GA                Oct 2013 – May 2015
Role: Lead/Sr. Informatica Developer
Responsibilities:
● Lead design, development, and implementation of the ETL projects end to end.
Responsible for ETL technical design discussions and prepared ETL high level
technical design document. Worked with Business users to understand the data,
understand the business and gather requirements.
● Involved in the analysis of source to target mapping provided by data analysts and
prepared function and technical design documents. Involved in designing of star
schema-based data model with dimensions and facts.
● Interacting with onsite and offshore team to assign Development tasks and
scheduling weekly status calls with offshore team on status. Extracted data from
flat files, Oracle, Sql server using Informatica ETL mappings and loaded to Data
Mart.
● Created complex Informatica mappings using Unconnected Lookup, Joiner, Rank, Source Qualifier, Sorter, Aggregator, Lookup, and Router transformations to extract, transform, and load data into the data mart area.
● Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data from various source systems. Worked extensively on shell scripting for file management. Created reusable transformations/mapplets and used them across various mappings.
● Wrote complex PL/SQL scripts /functions/procedures/packages.
● Developed Informatica workflows/worklets/sessions associated with the mappings
using Workflow Manager.
● Worked on performance tuning of Informatica sessions using partitioning like pass
through and dynamic partitioning.

Client: Wyndham Vacation Ownership, Orlando, FL                January 2012 – September 2013
Role: Sr. Informatica Developer
Responsibilities:
● Work with offshore/onsite team and lead the project and assign tasks appropriately
to the team members. Responsible for projects estimates, design documents,
resource utilization and allocations.
● Worked with transformations like Normalizer, Update Strategy, Lookup, Router,
Filter, and Joiner. Worked closely with Data Modeler in designing the Data Mart
Dimension tables and Fact Tables.
● Created Parameter files for Connection strings in Workflow Manager for reusability
of Connections across Dev/QA/Prod Environments.
● Migrated the code from lower environments Dev/QA to prod from Repository
Manager using deployment groups.
● Pulled data from Salesforce and loaded it into a SQL Server DB using Informatica Cloud. Familiar with Salesforce objects such as Accounts, Leads, Contacts, and Customers.
● Worked with COBOL (.cbl) files (VSAM, mainframe) as a source; pulled the data using the Normalizer transformation with the .cpy copybook format.
● Worked with UNIX shell scripting for file management. Used pmcmd commands to execute Informatica PowerCenter workflows.

Client: Bayview Financial, Miami, FL                August 2007 – December 2011
Role: Sr. Informatica Developer
Responsibilities:
● Led interactions with the Business Analyst to understand the business requirements. Involved in analyzing requirements to refine transformations.
● Responsible for mentoring developers and for code review of mappings developed by other developers.
● Created complex mappings in PowerCenter Designer 8.6 using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, and Stored Procedure transformations.
● Implemented performance tuning by identifying bottlenecks in Informatica mappings and sessions, and by using explain plans in Oracle via TOAD.
● Wrote SQL, PL/SQL, stored procedures & triggers, cursors for implementing
business rules and transformations.
● Created pre-session, post session, pre-sql, post sql commands for email
notifications with the Email Task, also to update target tables after the data is
loaded.
● Created and used tasks like Email Task, Command Task, and Control task in
Informatica workflow manager and monitored jobs in Workflow Monitor.
● Developed UNIX scripts for file management, such as zipping and unzipping files. Developed automated and scheduled load processes using the Tidal scheduler.
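The zip/unzip file-management scripting described above can be sketched in Python (the original work used UNIX shell; this is an illustrative equivalent, and the directory layout and `*.dat` pattern are assumptions):

```python
import gzip
import shutil
from pathlib import Path

def gzip_and_archive(src_dir, archive_dir, pattern="*.dat"):
    """Compress processed files and move them to an archive directory.

    A minimal sketch of post-load file housekeeping; paths and the
    *.dat pattern are placeholders, not the original job's layout.
    """
    archive = Path(archive_dir)
    archive.mkdir(parents=True, exist_ok=True)
    archived = []
    for src in sorted(Path(src_dir).glob(pattern)):
        dest = archive / (src.name + ".gz")
        with open(src, "rb") as fin, gzip.open(dest, "wb") as fout:
            shutil.copyfileobj(fin, fout)   # stream-compress the file
        src.unlink()                        # remove the original after archiving
        archived.append(dest)
    return archived
```

A scheduler entry (Tidal, cron) would run this after each load so the landing directory stays clean while history remains recoverable from the archive.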
