
Kishore P

E-mail: kishire.snowflake@gmail.com | Mobile: xxxxxxx

Professional Experience:
▪ Overall 4.2 years of professional IT experience in the areas of data warehousing, SQL,
Snowflake Data Cloud, and AWS.
▪ Experience in loading data into the Snowflake cloud data warehouse from AWS S3.
▪ Experience in migrating data from relational databases to Snowflake.
▪ Handling large, complex data sets in CSV, JSON, and Parquet file formats.
▪ Experience in bulk loading and unloading data into and out of Snowflake tables using the
COPY command (sketched after this list).
▪ Implemented SCD Type 1 data pipelines using Streams and Tasks (sketched below).
▪ Used COPY/INSERT, PUT, and GET commands to load data into Snowflake tables from
internal and external stages (example below).
▪ Experience in writing SQL queries in SnowSQL.
▪ Experience in using Zero-Copy Clone to replicate lower environments such as DEV and
STAGE (example below).
▪ Experience in accessing historical data through Time Travel (example below).
▪ Implemented a POC on continuous data loading into Snowflake using Snowpipe
(example below).
▪ Proficient in understanding business processes and requirements and translating them
into technical requirements.
▪ Experience in using multi-cluster warehouses to handle concurrency issues
(example below).
▪ Experience with performance tuning of the Snowflake data warehouse using the Query
Profile, caching, and virtual warehouse scaling.
▪ Experience in writing JavaScript- and SQL-based stored procedures.
▪ Experience working in different areas of RDBMS, including data loading through SQL*Loader.
▪ Extensively worked on ETL processes covering data sourcing, transformation, mapping,
conversion, and loading.
▪ Understanding customer requirements; carrying out analysis, design, development, and
implementation; gathering and defining business requirements; and enhancing business
processes.
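
A minimal sketch of the S3-to-Snowflake bulk-loading pattern above, assuming a hypothetical file format my_csv_format, stage my_s3_stage, and table SALES_RAW (all names, credentials, and the bucket URL are illustrative):

  -- Define a CSV file format and an external stage over an S3 bucket
  CREATE OR REPLACE FILE FORMAT my_csv_format
    TYPE = 'CSV' FIELD_OPTIONALLY_ENCLOSED_BY = '"' SKIP_HEADER = 1;

  CREATE OR REPLACE STAGE my_s3_stage
    URL = 's3://my-bucket/sales/'
    CREDENTIALS = (AWS_KEY_ID = '<key>' AWS_SECRET_KEY = '<secret>')
    FILE_FORMAT = my_csv_format;

  -- Bulk load every staged file; skip bad rows instead of aborting
  COPY INTO SALES_RAW
  FROM @my_s3_stage
  ON_ERROR = 'CONTINUE';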
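One way to realize the SCD Type 1 pipeline with Streams and Tasks; the staging table, dimension table, columns, and warehouse name here are assumptions:

  -- Capture changes arriving in the staging table
  CREATE OR REPLACE STREAM customer_stream ON TABLE CUSTOMER_STG;

  -- Scheduled task: the MERGE overwrites attributes in place (SCD Type 1)
  CREATE OR REPLACE TASK merge_customer_task
    WAREHOUSE = etl_wh
    SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('CUSTOMER_STREAM')
  AS
    MERGE INTO CUSTOMER_DIM d
    USING customer_stream s ON d.customer_id = s.customer_id
    WHEN MATCHED THEN UPDATE SET d.name = s.name, d.city = s.city
    WHEN NOT MATCHED THEN INSERT (customer_id, name, city)
      VALUES (s.customer_id, s.name, s.city);

  ALTER TASK merge_customer_task RESUME;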
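PUT and GET run from a SnowSQL client against internal stages; the local paths and the ORDERS table are invented for illustration:

  -- Upload a local file into the table's internal stage, then load it
  PUT file:///tmp/orders.csv @%ORDERS;
  COPY INTO ORDERS
  FROM @%ORDERS
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

  -- Unload query results to the user stage and download them locally
  COPY INTO @~/exports/orders_out
  FROM (SELECT * FROM ORDERS);
  GET @~/exports/orders_out file:///tmp/;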
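Zero-Copy Clone and Time Travel as referenced above, with hypothetical database and table names:

  -- Refresh a lower environment from production; only metadata is copied
  CREATE OR REPLACE DATABASE DEV_DB CLONE PROD_DB;

  -- Time Travel: read the table as it was one hour (3600 s) ago
  SELECT * FROM ORDERS AT (OFFSET => -3600);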
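A sketch of the Snowpipe POC pattern, reusing the illustrative stage and file format from the first example:

  -- Continuous loading: Snowpipe ingests new S3 files as event
  -- notifications arrive (AUTO_INGEST requires S3 event configuration)
  CREATE OR REPLACE PIPE orders_pipe AUTO_INGEST = TRUE AS
    COPY INTO ORDERS
    FROM @my_s3_stage
    FILE_FORMAT = (FORMAT_NAME = 'my_csv_format');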
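And a multi-cluster warehouse definition of the kind used for concurrency handling; the name and sizing numbers are placeholders:

  -- Scales out to four clusters under concurrent load, back to one when idle
  CREATE OR REPLACE WAREHOUSE reporting_wh
    WAREHOUSE_SIZE = 'MEDIUM'
    MIN_CLUSTER_COUNT = 1
    MAX_CLUSTER_COUNT = 4
    SCALING_POLICY = 'STANDARD'
    AUTO_SUSPEND = 300
    AUTO_RESUME = TRUE;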
Technical Skills:
Cloud Data warehouse: Snowflake.

ETL Tools: Informatica (working knowledge)

Databases: MySQL, Oracle 11g/10g.

Development Tools: SQL Developer.

Programming Languages: Oracle SQL, PL/SQL.


Operating Systems: Windows.
Cloud Platform: AWS (S3).

Experience Summary:
▪ Working at Lavens as a Snowflake and AWS developer from Oct 2021 to date.
▪ Worked at HCL as a Snowflake and AWS developer from May 2020 to Sep 2021.
▪ Worked at IBM as a SQL Developer from June 2013 to May 2017.

Project: Adidas B2B- Analytics and Reporting


Environment: Snowflake Cloud Data warehouse, AWS S3

Roles and Responsibilities:


▪ Work closely with Product Specialists and Project Managers to understand and maintain
focus on their analytical needs, including identifying critical metrics and KPIs, and deliver
actionable insights to the relevant decision-makers.
▪ Supporting the data warehouse in identifying source systems and revising reporting
requirements.
▪ Responsible for all activities related to the development, implementation, and support of
ETL processes for a large-scale Snowflake cloud data warehouse.
▪ Bulk loading from the external stage (AWS S3) to the internal stage (Snowflake) using the
COPY command.
▪ Loading data into Snowflake tables from the internal stage and from the local machine.
▪ Used import and export between the internal stage (Snowflake) and the external stage
(S3 bucket).
▪ Writing complex SnowSQL scripts in the Snowflake cloud data warehouse to support
business analysis and reporting.
▪ Created task flows to automate data loading/unloading to and from the Snowflake data
warehouse and AWS S3 (a sketch follows this list).
▪ Performing troubleshooting, analysis, and resolution of critical issues.
▪ Writing complex SQL scripts in the Snowflake data warehouse to support business reporting.
▪ Involved in data analysis and handling ad-hoc requests by interacting with business
analysts and clients, and resolving issues as part of production support.
▪ Preparing weekly, monthly, and quarterly status reports to update the customer on the
team's accomplishments.
▪ Conducting training and knowledge-sharing sessions for the testing and production
support teams.
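
A hedged sketch of such a task flow: a scheduled Task that unloads a reporting table back to the S3 stage nightly (the task, table, warehouse, and stage names are assumptions):

  -- Nightly unload of reporting results to S3 at 02:00 UTC
  CREATE OR REPLACE TASK unload_daily_sales
    WAREHOUSE = etl_wh
    SCHEDULE = 'USING CRON 0 2 * * * UTC'
  AS
    COPY INTO @my_s3_stage/exports/daily_sales/
    FROM (SELECT * FROM DAILY_SALES)
    OVERWRITE = TRUE;

  ALTER TASK unload_daily_sales RESUME;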

Project 2: Bank Database Display


Environment: SQL, MySQL, SQL Developer
The project performs all banking-related operations and was built using Python and MySQL.
There are multiple nested menus, and users can insert, display, update, withdraw from, and
deposit into an account. This interactive project introduces the reader to the concept of
cursors.
Responsibilities:

▪ Performed data profiling and data quality checks.
▪ Supported data model reviews and identified gaps, if any, to enhance the data model.
▪ Worked on .bat files for executing SQL files.
▪ Worked on import and export to load dump files into the schemas.
▪ Created and maintained complex PL/SQL programs such as packages, functions, and
triggers.
▪ Gathered requirements and provided estimates to the onshore team.
▪ Wrote dynamic SQL queries to create database objects dynamically (a sketch follows
this list).
▪ Tuned SQL queries and applied newly available Oracle DB techniques.
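
A minimal PL/SQL sketch of the dynamic-SQL pattern above, with an invented table name; EXECUTE IMMEDIATE builds the object at run time:

  -- Create a database object from a dynamically assembled statement
  DECLARE
    v_table_name VARCHAR2(30) := 'STAGE_ACCOUNTS';  -- illustrative name
  BEGIN
    EXECUTE IMMEDIATE 'CREATE TABLE ' || v_table_name || ' (
                         account_id NUMBER PRIMARY KEY,
                         balance    NUMBER(12,2)
                       )';
  END;
  /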
Client: Ernst & Young Global Limited
Project 3: Cardinal Health

Operating System: Windows 10, UNIX, Linux.

Environment: SQL, MYSQL, SQL Developer

Cardinal Health, Inc. is an American multinational health care services company and the
14th-highest-revenue-generating company in the United States. Its headquarters are in
Dublin, Ohio and Dublin, Ireland (EMEA). The company specializes in the distribution of
pharmaceuticals and medical products, serving more than 100,000 locations. It also
manufactures medical and surgical products, including gloves, surgical apparel, and fluid
management products, and operates the largest network of radiopharmacies in the U.S.
Cardinal Health provides medical products to over 75 percent of hospitals in the United States.

Responsibilities:
▪ Designing database tables and structures.
▪ Creating views, functions, and stored procedures.
▪ Writing optimized SQL queries for integration with other applications.
▪ Creating database triggers for use in automation (a sketch follows this list).
▪ Maintaining data quality and overseeing database security.
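
An illustrative automation trigger of this kind (Oracle syntax; the ORDERS table, audit table, and columns are hypothetical):

  -- Record every status change on ORDERS into an audit table
  CREATE OR REPLACE TRIGGER trg_orders_audit
  AFTER UPDATE OF status ON orders
  FOR EACH ROW
  BEGIN
    INSERT INTO orders_audit (order_id, old_status, new_status, changed_at)
    VALUES (:OLD.order_id, :OLD.status, :NEW.status, SYSDATE);
  END;
  /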

Professional Qualification:

▪ Master of Information and Communication Technology from La Trobe University,
Melbourne, Australia.
