Murali, DataStage Developer (8+ Years)
Mail: Polamn.acnts.us@gmail.com
Mobile No: +1 (651) 315 0297
EXPERIENCE SUMMARY
• Passionate and dedicated data engineer with 8+ years of experience in DWH, ETL/ELT, IBM DataStage, Big Data & BI enterprise applications, and SQL-based technologies.
• Designed and implemented complex data architectures using data platform capabilities such as IBM DataStage, Cognos, Hadoop, Hive, and Sqoop.
• Expertise in using tools such as Sqoop to ingest data into Hadoop HDFS.
• Good experience in Agile methodology; worked with Agile tools such as Jira.
• Worked on various projects across domains such as banking, financial services, retail, and infrastructure.
• Played a significant role in various phases of the project life cycle, such as requirements definition, technical design and development, testing, production support, and implementation.
• Extensively worked on implementing data warehousing concepts such as star schema, snowflake schema, data marts, ODS, and dimension and fact tables.
• Strong ability to identify problems, develop innovative ideas, and find the solutions best suited to the environment at hand.
• Extremely self-motivated individual who thrives on meeting tight deadlines and is comfortable in a high-pressure environment. Broad technical awareness with the ability to communicate at all levels.
• Good exposure to managing projects in an onsite-offshore model, with experience managing development teams in India, the US, and Australia.
EDUCATION
CONSULTING/TECHNICAL SKILLS
PROFESSIONAL EXPERIENCE
VF Corporation is one of the world’s largest apparel, footwear and accessories companies, connecting people to the lifestyles, activities and experiences they cherish most through a family of iconic outdoor, active and workwear brands. VF Corporation focuses mainly on three areas: growth and profitability, portfolio strategy & segmentation, and performance management & joint business planning.
RESPONSIBILITIES
• Understood VFC functional and non-functional requirements from business users and performed gap analysis.
• Designed solutions for new business requirements.
• Designed complex mappings, including business transformations, for data transfer from source systems to targets.
• Responsible for identifying risks during project execution and arriving at solutions to address them at an early stage of the project.
• Designed the application to manage the flow of data/files, e.g., moving from source to intermediate/staging to target.
• Worked on the Cognos reporting tool to run and monitor jobs and to change report distribution details.
• Prepared status reports and other operational documents for software maintenance.
• Implemented load logic such as incremental loading, change data capture, and slowly changing dimensions (see the sketch below).
• Analyzed dependencies among various interfaces and designs and suggested the necessary system improvements.
• Used Unix commands to validate files.
• Wrote unit test cases and submitted unit test results as per the quality process.
• Deployed code to the production environment using Git.
• Monitored jobs through the Control-M scheduler.
• Prepared the necessary documentation on gap analysis and design.
• Followed the change management process for any new change request.
• Optimized application code for performance gains.
Environment: IBM DataStage PX 11.5, Cognos, DB2, Control-M, Jira, qTest, Linux
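Illustrative sketch of the incremental-load / SCD Type 2 pattern referenced in the bullets above. This is a minimal, stand-alone Python example; the column names (cust_id, segment, eff_start_dt, eff_end_dt) are hypothetical and not taken from the actual VFC model, where this logic was built in DataStage jobs.

# Minimal SCD Type 2 / change-capture sketch (illustrative only).
from copy import deepcopy
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # open-ended effective end date

def apply_scd2(target_rows, source_rows, key, tracked_cols, load_date):
    """Return a new dimension row set: unchanged history is kept, changed keys
    get their open version expired and a new version inserted, and new keys
    get a first version (new rows are an incremental delta, not a full reload)."""
    rows = deepcopy(target_rows)  # never mutate the caller's data
    open_rows = {r[key]: r for r in rows if r["eff_end_dt"] == HIGH_DATE}
    for src in source_rows:
        cur = open_rows.get(src[key])
        if cur is not None and all(src[c] == cur[c] for c in tracked_cols):
            continue                       # no change -> incremental no-op
        if cur is not None:
            cur["eff_end_dt"] = load_date  # expire the current open version
        new_row = {**src, "eff_start_dt": load_date, "eff_end_dt": HIGH_DATE}
        rows.append(new_row)
        open_rows[src[key]] = new_row
    return rows

if __name__ == "__main__":
    target = [{"cust_id": 1, "segment": "Retail",
               "eff_start_dt": date(2020, 1, 1), "eff_end_dt": HIGH_DATE}]
    source = [{"cust_id": 1, "segment": "Wholesale"},  # changed attribute
              {"cust_id": 2, "segment": "Outlet"}]     # new business key
    for row in apply_scd2(target, source, "cust_id", ["segment"], date(2024, 6, 1)):
        print(row)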
EMPLOYER: HCL Technologies
The United Services Automobile Association (USAA) is a San Antonio-based Fortune 500 diversified financial
services group of companies including a Texas Department of Insurance-regulated reciprocal inter-insurance
exchange and subsidiaries offering banking, investing, and insurance to people and families who serve, or served,
in the United States Armed Forces.
RESPONSIBILITIES
• Designed and developed end-to-end (E2E) data flows from source databases such as Teradata and DB2 to the target data model in Hive, including denormalization of 327 source tables into the big data environment.
• Used Sqoop to ingest high volumes of data from Teradata and DB2 databases.
• Used HDFS and Unix commands to validate files.
• Created databases and tables (external and internal) in Hive with compression techniques on top of HDFS files.
• Designed and executed Hive scripts to build the denormalized layer based on requirement documents.
• Performed Hive performance tuning using Tez, MapReduce, vectorized execution, ORC, Hive hints, and bucket map joins (see the sketch below).
• Validated target table data against the driver table date range and delivered the release notes.
• Wrote unit test cases and submitted unit test results as per the quality process.
• Monitored production jobs and delivered the handover sheet with run times, table counts, and validation results.
Environment: IBM DataStage PX 11.5, Big Data, Hadoop, Hive, Python, Sqoop, Oracle, Linux, Control-M
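Illustrative sketch of the Hive tuning pattern referenced in the bullets above, driven from Python through the hive CLI. The table names (denorm.policy_denorm, stg.policy, stg.member) are hypothetical, and the settings shown are commonly used ones rather than the exact project configuration.

# Minimal Hive tuning sketch (illustrative only; not the actual project HQL).
import subprocess

# Session-level settings: Tez engine, vectorized execution, and map/bucket-map joins.
TUNING = """
SET hive.execution.engine=tez;
SET hive.vectorized.execution.enabled=true;
SET hive.auto.convert.join=true;
SET hive.optimize.bucketmapjoin=true;
"""

# Build the denormalized target as ORC with Snappy compression; the MAPJOIN
# hint marks the small side of the join. All table names are hypothetical.
HQL = TUNING + """
CREATE TABLE IF NOT EXISTS denorm.policy_denorm
STORED AS ORC
TBLPROPERTIES ('orc.compress'='SNAPPY')
AS
SELECT /*+ MAPJOIN(m) */ p.policy_id, p.start_dt, m.member_nm
FROM stg.policy p
JOIN stg.member m ON p.member_id = m.member_id;
"""

def run_hive(hql: str) -> None:
    # Submit through the hive CLI; a beeline call with a JDBC URL works the same way.
    subprocess.run(["hive", "-e", hql], check=True)

if __name__ == "__main__":
    run_hive(HQL)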
VicRoads, the Roads Corporation of Victoria, is a statutory corporation and the road and traffic authority in the state of Victoria, Australia. It is responsible for maintenance and construction of the arterial road network, as well as vehicle registration. VicRoads has broad responsibility for road safety policy and research and also regulates the accident towing industry in Victoria. The project provides traffic information with a granularity that better supports the business needs of VicRoads data consumers.
RESPONSIBILITIES
• Gathered and thoroughly understood the business rules and implemented the data transformation methodology.
• Extracted data from sources such as flat files to transform and load into target databases.
• Designed and developed DataStage parallel jobs for extracting, cleansing, transforming, integrating, and loading data using DataStage Designer.
• Extensively wrote user-defined SQL in place of auto-generated SQL queries in DataStage.
• Redesigned and modified existing program logic to improve overall system performance.
• Worked with DataStage Manager to export and import jobs and to import metadata from the repository.
• Prepared the HLD documents for user stories and participated in review meetings.
• Used Unix commands to validate files (see the sketch below).
• Wrote unit test cases and submitted unit test results as per the quality process.
• Deployed code to the production environment using Git.
• Monitored jobs through the Control-M scheduler.
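Illustrative sketch of the file validation referenced above, written as a small Python utility equivalent to the Unix checks. The control-file layout (a single line holding the expected record count), the assumption of one header row in the data file, and the file names in the usage comment are all hypothetical, made up for the example.

# Minimal flat-file validation sketch (illustrative only).
import sys
from pathlib import Path

def validate(data_file: str, control_file: str) -> bool:
    data, ctrl = Path(data_file), Path(control_file)
    if not data.is_file() or data.stat().st_size == 0:
        print(f"FAIL: {data} missing or empty")
        return False
    expected = int(ctrl.read_text().strip())  # expected record count from the control file
    with data.open() as fh:
        actual = sum(1 for _ in fh) - 1       # minus the assumed header row
    if actual != expected:
        print(f"FAIL: expected {expected} records, found {actual}")
        return False
    print(f"OK: {data.name} validated with {actual} records")
    return True

if __name__ == "__main__":
    # e.g. python validate_file.py registrations_20240601.dat registrations_20240601.ctl
    sys.exit(0 if validate(sys.argv[1], sys.argv[2]) else 1)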
RESPONSIBILITIES