Shravani R
• 5+ years of professional experience working with SQL and the Snowflake cloud data warehouse on AWS/Azure cloud platforms.
• Experience in data migration from SQL Server to the Snowflake cloud data warehouse.
• Experience in implementing a Data Lake on AWS/Azure and a Data Warehouse on the Snowflake Data Cloud.
• Deep understanding of Snowflake architecture, caching, and virtual warehouse scaling. Expertise in bulk loading into Snowflake tables using the COPY command.
• Experience in creating Snowflake objects such as databases, schemas, tables, sequences, views, and file formats.
• Created data pipelines with streams and tasks.
• Exposure to processing semi-structured data files such as JSON and Parquet in Snowflake. Experience using zero-copy cloning to replicate lower environments from production.
• Handled large and complex data sets in CSV format from object stores like AWS S3.
• Experience with Snowflake features such as Data Sharing, Snowpipe, Tasks, Zero-Copy Cloning, and Time Travel.
• Experience with the SQL Server relational database; developed procedures, functions, and views.
• Implemented SCD Type 1 and SCD Type 2 data loads with streams and tasks in Snowflake (see the sketch after this list).
• Experience with all phases of the software development life cycle (SDLC) and Agile methodologies.
• Effective problem-solving and analytical skills, with expertise in interacting with business users.
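As an illustration of the streams-and-tasks loading pattern noted above, here is a minimal Snowflake SQL sketch of an SCD Type 1 merge; all object names (src_customers, customers_stream, dim_customer, etl_wh) are hypothetical stand-ins, and a Type 2 variant would additionally keep effective-date columns and expire the prior row version.

    -- Stream captures row-level changes on the (hypothetical) source table.
    CREATE OR REPLACE STREAM customers_stream ON TABLE src_customers;

    -- Scheduled task applies the changes as an SCD Type 1 (overwrite-in-place)
    -- merge, running only when the stream actually has data.
    CREATE OR REPLACE TASK merge_dim_customer
      WAREHOUSE = etl_wh
      SCHEDULE  = '15 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('CUSTOMERS_STREAM')
    AS
      MERGE INTO dim_customer d
      USING (SELECT * FROM customers_stream WHERE METADATA$ACTION = 'INSERT') s
         ON d.customer_id = s.customer_id
      WHEN MATCHED THEN
        UPDATE SET d.name = s.name, d.updated_at = CURRENT_TIMESTAMP()
      WHEN NOT MATCHED THEN
        INSERT (customer_id, name, updated_at)
        VALUES (s.customer_id, s.name, CURRENT_TIMESTAMP());

    ALTER TASK merge_dim_customer RESUME;  -- tasks are created suspended

Consuming the stream inside the MERGE advances its offset on commit, so each task run processes only new changes.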
Professional Experience:
• Working as a Sr. Data Engineer at Brillio Technologies, Bangalore, from September 2019 to date.
Educational Qualification:
Technical Profile:
Project # 1
Project : Enterprise Data Warehouse
Client : NatWest Bank
Project Profile:
National Westminster Bank Plc, trading as NatWest, is a major retail and commercial bank in the United Kingdom, based in London, England. It was established in 1968 by the merger of National Provincial Bank and Westminster Bank.
Role and Responsibilities:
• Responsible for end-to-end data migration from RDBMS to the Snowflake Cloud Data Warehouse.
• Bulk loading from the external stage (AWS S3) into Snowflake using the COPY command (illustrated in the sketch after this list).
• Creating views and materialized views in the Snowflake cloud data warehouse for business analysis and reporting.
• Responsible for all activities related to the development, implementation, administration, and support of ETL processes for a large-scale Snowflake cloud data warehouse.
• Created task flows to automate data loading/unloading to/from the Snowflake data warehouse and AWS S3.
• Created internal and external stages and transformed data during load.
• Created file formats, functions, views, etc.
• Cloned databases for Dev and QA environments using Zero-Copy Cloning.
• Worked on Snowflake streams to process incremental records and for CDC.
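A minimal sketch of the bulk-load and cloning steps above, assuming hypothetical names (csv_ff, s3_stage, s3_int, raw.loans, prod_edw/dev_edw) and an already-configured storage integration:

    -- File format for the incoming CSV extracts.
    CREATE OR REPLACE FILE FORMAT csv_ff
      TYPE = 'CSV'
      FIELD_OPTIONALLY_ENCLOSED_BY = '"'
      SKIP_HEADER = 1;

    -- External stage over the S3 landing bucket (placeholder URL).
    CREATE OR REPLACE STAGE s3_stage
      URL = 's3://example-bucket/landing/'
      STORAGE_INTEGRATION = s3_int
      FILE_FORMAT = (FORMAT_NAME = 'csv_ff');

    -- Bulk load into the target table; abort the whole load on any bad record.
    COPY INTO raw.loans
      FROM @s3_stage/loans/
      FILE_FORMAT = (FORMAT_NAME = 'csv_ff')
      ON_ERROR = 'ABORT_STATEMENT';

    -- Zero-copy clone of production for a lower environment (metadata-only).
    CREATE OR REPLACE DATABASE dev_edw CLONE prod_edw;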
Project # 2
Project : Pennymac Loan Mart
Client : PennyMac Inc.
Environment tools: SQL, Azure, AWS S3, Snowflake
Project Profile:
PennyMac Financial Services, Inc. is an American residential mortgage company headquartered in Westlake
Village, California. The company's business focuses on the production and servicing of U.S. mortgage loans
and the management of investments related to the U.S. mortgage market.
Pennymac Loan Mart gives a complete overview of the status of loans at different stages of the loan cycle, for example the number of loans sliced by different dimensions.
Role and Responsibilities:
• Responsible for all activities related to the development, implementation, administration, and support of ETL processes for a large-scale Snowflake cloud data warehouse.
• Bulk loading from the external stage (AWS S3) into Snowflake using the COPY command.
• Creating views and materialized views in the Snowflake cloud data warehouse for business analysis and reporting.
• Created task flows to automate data loading/unloading to/from the Snowflake data warehouse and AWS S3 (see the sketch after this list).
• Created internal and external stages and transformed data during load.
• Created file formats, functions, views, etc.
• Cloned databases for Dev and QA environments using Zero-Copy Cloning.
• Worked on Snowflake streams to process incremental records and for CDC.
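A minimal sketch of such a task flow, assuming hypothetical names (etl_wh, s3_stage, csv_ff, staging.loans, marts.loan_status); a root task loads nightly from S3 and a dependent task unloads reporting results back:

    -- Root task: nightly bulk load from the S3 stage.
    CREATE OR REPLACE TASK load_loanmart
      WAREHOUSE = etl_wh
      SCHEDULE  = 'USING CRON 0 2 * * * UTC'
    AS
      COPY INTO staging.loans
      FROM @s3_stage/loans/
      FILE_FORMAT = (FORMAT_NAME = 'csv_ff');

    -- Child task: unload the loan-status mart back to S3 once the load finishes.
    CREATE OR REPLACE TASK unload_loanmart
      WAREHOUSE = etl_wh
      AFTER load_loanmart
    AS
      COPY INTO @s3_stage/exports/loan_status/
      FROM (SELECT loan_id, stage, status_date FROM marts.loan_status)
      FILE_FORMAT = (TYPE = 'CSV' COMPRESSION = 'GZIP')
      OVERWRITE = TRUE;

    -- Child tasks must be resumed before the root task.
    ALTER TASK unload_loanmart RESUME;
    ALTER TASK load_loanmart RESUME;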
Project # 3
Project : Medical Aesthetics
Client : Allergan
Project Profile:
Allergan is a leading American pharmaceutical company. The Medical Aesthetics (MA) project builds a data warehouse in Snowflake, and downstream and reporting teams consume data from MA. Business leaders, managers, and sales representatives can view performance and commission information.
Role and Responsibilities:
• Monitoring ETL jobs on SQL Server Agent.
• Fixing production issues based on errors encountered in log tables.
• Involved in the creation of tables and constraints; created stored procedures and table-valued functions; designed a data warehouse with a star schema using T-SQL (see the sketch after this list).
• Design and deployment of reports using SSRS from the SQL Server database (OLTP) and the SQL Server Analysis Services database (OLAP).
• Creating test cases and deployment documents for the elevations.
• Ensuring all business logic stays in sync in SSIS.
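A minimal T-SQL sketch of the stored-procedure and error-logging pattern above; all object names (staging.Sales, dbo.FactSales, the dimension tables, dbo.EtlErrorLog) are hypothetical stand-ins:

    -- Load the fact table of a star schema, logging failures to an error table.
    CREATE PROCEDURE dbo.usp_load_fact_sales
    AS
    BEGIN
        SET NOCOUNT ON;
        BEGIN TRY
            INSERT INTO dbo.FactSales (DateKey, ProductKey, CustomerKey, SalesAmount)
            SELECT d.DateKey, p.ProductKey, c.CustomerKey, s.Amount
            FROM   staging.Sales s
            JOIN   dbo.DimDate     d ON d.FullDate    = s.SaleDate
            JOIN   dbo.DimProduct  p ON p.ProductCode = s.ProductCode
            JOIN   dbo.DimCustomer c ON c.CustomerNo  = s.CustomerNo;
        END TRY
        BEGIN CATCH
            -- Record the failure so production issues can be traced from log tables.
            INSERT INTO dbo.EtlErrorLog (ProcName, ErrorMessage, LoggedAt)
            VALUES (OBJECT_NAME(@@PROCID), ERROR_MESSAGE(), SYSUTCDATETIME());
            THROW;  -- re-raise so SQL Server Agent marks the job step as failed
        END CATCH
    END;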