
Murali P – Senior Data Engineer

Mail: Polamn.acnts.us@gmail.com
Mobile No: +1 (651) 315 0297

EXPERIENCE SUMMARY

• Passionate and dedicated data engineer with 8+ years of experience in DWH, ETL/ELT, IBM DataStage, Big Data, BI enterprise applications, and SQL-based technologies.
• Designed and implemented complex data architectures using data platform capabilities such as IBM DataStage, Cognos, Hadoop, Hive, and Sqoop.
• Expertise in using tools such as Sqoop to ingest data into Hadoop HDFS.
• Good experience with Agile methodology; worked with Agile tools such as Jira.
• Worked on various projects across domains such as banking, financial services, retail, and infrastructure.
• Played a significant role in various phases of the project life cycle, such as requirements definition, technical design and development, testing, production support, and implementation.
• Extensively worked on implementing data warehousing concepts such as star schema, snowflake schema, data marts, ODS, and dimension and fact tables.
• Strong ability to identify problems, develop innovative ideas, and find the best solutions for the prescribed environment.
• Extremely self-motivated individual who thrives on meeting tight deadlines and is comfortable in a high-pressure environment. Broad technical awareness with the ability to communicate at all levels.
• Good exposure to managing projects in an onsite-offshore model; experienced in managing development teams in India, the US, and Australia.

EDUCATION

- Bachelor of Technology from Jawaharlal Nehru Technological University.

CONSULTING/TECHNICAL SKILLS

Cloud Solutions       : Snowflake
Big Data Technologies : Hadoop, HDFS, Hive, Sqoop
ETL Tools             : IBM WebSphere DataStage PX 8.0.1, 8.5, 9.1, and 11.5
Reporting/Other Tools : Cognos 8.4, Control-M, ActiveBatch, Jira, qTest
Database(s)           : Oracle 10g, DB2, and Teradata
Languages/OS          : SQL, UNIX, and Linux
Change Management     : HP Service Manager

PROFESSIONAL EXPERIENCE

EMPLOYER: Wipro Technologies

CLIENT: VF Corporation Nov 2021 – May 2022


ROLE: BI Technical Lead

VF Corporation is one of the world’s largest apparel, footwear, and accessories companies, connecting people to the lifestyles, activities, and experiences they cherish most through a family of iconic outdoor, active, and workwear brands. VF Corporation mainly focuses on three areas: growth and profitability, portfolio strategy & segmentation, and performance management & joint business planning.

RESPONSIBILITIES
• Understanding VFC functional and non-functional requirements from business users and performing gap analysis.
• Designing solutions for new business requirements.
• Designing complex mappings, including business transformations, for data transfer from source systems to targets.
• Identifying risks during project execution and arriving at solutions to address them at an early stage of the project.
• Designing the application to manage the flow of data and files, e.g. moving from source through intermediate/staging to target.
• Working on the Cognos reporting tool to run and monitor jobs and to change report distribution details.
• Preparing status reports and other operational documents for software maintenance.
• Implementing load logic such as incremental loading, change data capture, and slowly changing dimensions.
• Analyzing dependencies among various interfaces and designs and suggesting necessary system improvements.
• Using Unix commands to validate files.
• Writing unit test cases and submitting unit test results as per the quality process.
• Deploying code to the production environment using Git.
• Monitoring jobs through the Control-M scheduler.
• Preparing the necessary documentation on gap analysis and design.
• Following the change management process for any new change request.
• Optimizing application code for performance gains.
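
The incremental-loading and slowly-changing-dimension work above can be sketched as follows. The table layout, field names, and helper function are hypothetical, shown only to illustrate SCD Type 2 change capture; in the actual project this logic lived in DataStage mappings rather than Python:

```python
from datetime import date

# Hypothetical SCD Type 2 sketch: when a tracked attribute changes,
# expire the current dimension row and insert a new current version.
def apply_scd2(dim_rows, incoming, business_key, tracked, load_date):
    """dim_rows: dimension rows as dicts with 'is_current', 'valid_from',
    'valid_to'. incoming: source rows keyed by business_key."""
    current = {r[business_key]: r for r in dim_rows if r["is_current"]}
    for src in incoming:
        existing = current.get(src[business_key])
        if existing and all(existing[c] == src[c] for c in tracked):
            continue  # no change captured: nothing to do
        if existing:  # change captured: expire the old version
            existing["is_current"] = False
            existing["valid_to"] = load_date
        dim_rows.append({**src,
                         "is_current": True,
                         "valid_from": load_date,
                         "valid_to": None})
    return dim_rows
```

A changed attribute thus yields two rows for the same business key: the expired history row and the new current row.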

Environment: IBM DataStage PX 11.5, Cognos, DB2, Control-M, Jira, qTest, Linux

EMPLOYER: HCL Technologies

CLIENT: USAA Oct 2019 – Oct 2021


ROLE: Senior Software Engineer

The United Services Automobile Association (USAA) is a San Antonio-based Fortune 500 diversified financial services group of companies, including a Texas Department of Insurance-regulated reciprocal inter-insurance exchange and subsidiaries, offering banking, investing, and insurance to people and families who serve, or have served, in the United States Armed Forces.

RESPONSIBILITIES

• Designing and developing E2E data flows from source databases such as Teradata and DB2 to the target data model in Hive, including denormalization of 327 source tables into the big data environment.
• Using Sqoop to import high volumes of data from Teradata and DB2 databases.
• Using HDFS and Unix commands to validate files.
• Creating databases and tables (external and internal) in Hive with compression techniques on top of HDFS files.
• Designing and executing Hive scripts to build the denormalized layer based on requirement documents.
• Performance tuning in Hive through Tez, MapReduce, vectorized execution, ORC, Hive hints, and bucket map joins.
• Testing whether the target table data falls within the driver table date range and delivering release notes.
• Writing unit test cases and submitting unit test results as per the quality process.
• Monitoring production jobs and delivering the handover sheet with run time, table counts, and validation results.
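
The driver-table date-range check mentioned above can be sketched like this; the function name and sample data are hypothetical stand-ins for the Hive-side validation:

```python
from datetime import date

# Hypothetical post-load validation sketch: confirm every loaded record's
# date falls inside the driver table's date range before sign-off.
def validate_date_range(target_dates, range_start, range_end):
    """Return the sorted out-of-range dates (empty list means the load passed)."""
    return sorted(d for d in target_dates if not (range_start <= d <= range_end))

loaded = [date(2021, 3, 1), date(2021, 3, 15), date(2021, 4, 2)]
bad = validate_date_range(loaded, date(2021, 3, 1), date(2021, 3, 31))
```

Any dates returned would be flagged in the release notes before handover.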

Environment: IBM DataStage PX 11.5, Big Data, Hadoop, Hive, Python, Sqoop, Oracle, Linux, Control-M

CLIENT: VicRoads Sep 2016 – Sep 2019


ROLE: Senior Software Engineer

VicRoads, the Roads Corporation of Victoria, is a statutory corporation that serves as the road and traffic authority in the state of Victoria, Australia. It is responsible for maintenance and construction of the arterial road network, as well as vehicle registration, and has broad responsibility for road safety policy and research. It also regulates the accident towing industry in Victoria. The project's goal was to provide traffic information with a granularity that better supports the business needs of VicRoads data consumers.

RESPONSIBILITIES
• Gathering and thoroughly understanding business rules and implementing the data transformation methodology.
• Extracting data from sources such as flat files, transforming it, and loading it into target databases.
• Designing and developing DataStage parallel jobs for extracting, cleansing, transforming, integrating, and loading data using DataStage Designer.
• Extensively writing user-defined SQL to replace auto-generated SQL queries in DataStage.
• Redesigning and modifying existing program logic to improve overall system performance.
• Working with DataStage Manager to export and import jobs and to import metadata from the repository.
• Preparing HLD documents for user stories and participating in review meetings.
• Using Unix commands to validate files.
• Writing unit test cases and submitting unit test results as per the quality process.
• Deploying code to the production environment using Git.
• Monitoring jobs through the Control-M scheduler.
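
A minimal sketch of the flat-file validation step, assuming a pipe-delimited layout; in the project this was done with Unix commands, and the helper here is a hypothetical equivalent:

```python
import csv
import io

# Hypothetical flat-file validation sketch: flag rows whose field count does
# not match the header, mirroring the checks typically done with awk/wc.
def validate_flat_file(text, delimiter="|"):
    reader = csv.reader(io.StringIO(text), delimiter=delimiter)
    header = next(reader)
    bad_lines = [line_no for line_no, row in enumerate(reader, start=2)
                 if len(row) != len(header)]
    return len(header), bad_lines  # (expected field count, offending lines)
```

Files with any offending lines would be rejected before the DataStage load.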

Environment: IBM DataStage PX 9.1, Oracle, Linux

CLIENT: Commonwealth Bank of Australia Nov 2013 – Aug 2016


ROLE: Senior Software Engineer
Commonwealth Bank is an Australian multinational banking and financial services company headquartered in Sydney, Australia. This project migrated DataStage from the 8.1 server edition to the 9.1 server edition. DataStage is a critical component of the Group BU service offering for Business Intelligence (BI). DataStage was running on version 8.1, supported by the vendor IBM, and vendor support for version 8.1 was scheduled to terminate in September 2014. To enable ongoing vendor support, the DataStage application had to be upgraded. The primary objective of the project was to upgrade DataStage to a vendor-supported version, while also improving job performance over the 8.1 version.

RESPONSIBILITIES

• Setting up the environment for each project in both DS 8.1 and DS 9.1.
• Performing dry runs to ensure all objects were in place and granting the required privileges on database objects.
• Running jobs in the 8.1 and 9.1 environments through the Autosys scheduling tool on UNIX and resolving issues whenever a job aborted.
• Comparing the files generated in both environments and preparing MINUS queries on tables from both environments to verify identical results.
• Comparing job run times with the previous version.
• Working with DataStage Designer to export and import jobs and to promote jobs from the Dev to the Test region.
• Preparing code change documents whenever a code change occurred.
• Using Unix commands to validate files.
• Writing unit test cases and submitting unit test results and issue logs as per the quality process.
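
The cross-environment MINUS-query comparison above can be sketched as follows; sqlite3 (whose EXCEPT operator plays the role of MINUS) and the table names are hypothetical stand-ins for the actual 8.1/9.1 environment tables:

```python
import sqlite3

# Hypothetical migration-check sketch: a MINUS/EXCEPT query in each direction
# reveals rows present in one environment's table but not in the other.
def minus_compare(conn, table_a, table_b):
    a_minus_b = conn.execute(
        f"SELECT * FROM {table_a} EXCEPT SELECT * FROM {table_b}").fetchall()
    b_minus_a = conn.execute(
        f"SELECT * FROM {table_b} EXCEPT SELECT * FROM {table_a}").fetchall()
    return a_minus_b, b_minus_a  # both empty => environments agree

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE out_81 (id INTEGER, amt REAL)")
conn.execute("CREATE TABLE out_91 (id INTEGER, amt REAL)")
conn.executemany("INSERT INTO out_81 VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
conn.executemany("INSERT INTO out_91 VALUES (?, ?)", [(1, 10.0), (2, 20.5)])
diff_81, diff_91 = minus_compare(conn, "out_81", "out_91")
```

Running the query in both directions matters: a one-way MINUS would miss rows that exist only in the newer environment.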

Environment: DS 8.1, DS 9.1, Teradata, Oracle, Autosys Scheduler, UNIX

