
Vipal Kumari Patel

Email - vipalkumaripython@gmail.com
Contact no - 214 446 6199

Professional Summary:

 5+ years of experience in the field of Data Science, with strong technical, analytical, decision-making, and problem-solving skills.
 Experience in Site Reliability Engineering (SRE) operations, Kafka, and AWS services such as Amazon EC2, Amazon S3, AWS Step Functions, Amazon EMR, and Amazon SQS.
 Strong business analysis experience with in-depth knowledge of business processes in various industries.
 Ability to work on multiple projects simultaneously and deliver content and brand strategies within tight deadlines.
 A thorough understanding of the Software Development Life Cycle (SDLC).
 Experienced working with project teams in Agile/Scrum environments.
 Advanced user of MS Project and the Microsoft Office Suite (Excel, Word, Visio, PowerPoint, and SharePoint), with an aptitude for learning proprietary software and databases.
 Can work well under pressure with solid time management skills.
 Excellent written and verbal communication skills.
 Experienced in developing web-based applications using Python, Django, Flask, C++, XML, CSS, HTML, and jQuery.
 Five years of expertise with Python web frameworks such as Flask and Django.
 Experience in Java, JavaScript, and Python software development environments using Agile methodology.
 Extensively used Python data science packages including Pandas, NumPy, Matplotlib, Seaborn, SciPy, scikit-learn, and NLTK.
 Installed applications on AWS EC2 instances and configured storage on S3 buckets.
 Configured Elastic Load Balancers (ELB) with EC2 Auto Scaling groups; created monitors, alarms, and notifications for EC2 hosts using CloudWatch.
 Involved in deploying systems on AWS infrastructure services: EC2, S3, SQS, and CloudFormation.
 Maintained and developed Docker images for a tech stack including Cassandra, Kafka, Apache, and several in-house applications.
 Experience with file handling and SSH (Secure Shell).
 Experience with Linux Bash scripting and with following PEP guidelines in Python.
 Good experience performing CRUD operations and writing complex queries against Oracle 10g/11g.
 Proficient in writing SQL queries, stored procedures, functions, packages, cursors, tables, views, and triggers using relational databases such as Oracle and MS SQL Server.

Technical Skills:
Programming Languages : Python, Java, C, SQL, HTML, CSS, PHP, JavaScript

Tools : Tableau, MS Excel, Power BI, MS Office, Jupyter, Git, AWS

Databases : SQL Server, MySQL, Oracle, MongoDB, DynamoDB, SSMS

Data Science : NumPy, Pandas, scikit-learn, Matplotlib, TensorFlow, Keras, Deep Learning

Statistical, Econometric & Machine Learning
Techniques : Descriptive Statistics, Regression, Time Series, Panel Data Regression,
Bayesian Methods, Clustering, Dimensionality Reduction, Market Basket
Analysis, Logit, Probit and Tobit Models, Ensemble Methods (Bagging, Boosting,
and Pasting), Perceptron, CNN and RNN, Monte Carlo Simulations, NLP

Education : High school in Computer Science Engineering

Certifications : Trained in Python, Java, and SQL programming at Durga Soft

Work Experience:
Client: Charter Communications Dec 2020 – Present

Role: Python/AWS Developer

Responsibilities:

 Develop new scripts and write automation scripts to reduce manual work within existing frameworks.
 Identify gaps that require manual intervention in the processing of claims data and write Python automation scripts to bridge them.
 Chunk large datasets into smaller pieces using Python scripts to enable faster data processing.
 Run and generate reports on claims data, and organize demos to present the reports to business users and clients.
 Work in an Agile methodology, collaborating with customers and the team to identify ineffective methods or practices.
 Automated AWS Step Functions executions to trigger Lambda functions and EMR jobs.
 Ran AWS Glue jobs executing Big Data (PySpark) queries to process datasets, and served in Site Reliability operations onsite.
 Invoked EC2 instances, EMR clusters, and Glue jobs using AWS CloudFormation scripts.
 Wrote Python scripts to automate AWS S3 workflows and manage datasets from PostgreSQL.
 Automated a nightly quality-control build using Python with the Boto3 library to ensure the pipeline does not fail.
 Serve as Scrum Master for the team, gathering requirements from stakeholders and business users, developing timelines for assigned tickets, and monitoring their development to ensure the team meets deadlines.
 Work on web-services tools for quality optimization and performance improvement of the framework used to process claims data.
 Lead the team in researching and designing new workflows in AWS pipelines and in architecting the required business documents.
 Work with different databases to extract and transform claims data.
 Write SQL and PySpark queries to verify that incoming data is free of discrepancies before processing.
 Worked extensively with Boto3 to automate AWS services using Python.
 Work on cloud services to enhance compute capacity, along with Kafka, microservices, Spring Boot, and Java.
 Work with secure databases, writing SQL queries in tools such as AWS Glue and AWS Athena that provide the security needed to process data safely.
 Optimize and enhance the performance of cloud-based virtual machines such as AWS EMR and EC2 instances.
 Actively participate in testing and quality checks for automation and development scripts.
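The dataset-chunking step described above can be sketched as a minimal, self-contained Python example. File names, the CSV layout, and the chunk size are illustrative assumptions, not the actual production script:

```python
import csv
from pathlib import Path


def chunk_csv(src_path, out_dir, rows_per_chunk=50_000):
    """Split a large CSV into smaller chunk files, repeating the header
    in each chunk so every piece is independently processable.

    Returns the list of chunk file paths written.
    """
    src = Path(src_path)
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    written = []
    with src.open(newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        chunk, idx = [], 0
        for row in reader:
            chunk.append(row)
            if len(chunk) == rows_per_chunk:
                written.append(_write_chunk(out, src.stem, idx, header, chunk))
                chunk, idx = [], idx + 1
        if chunk:  # flush the final, possibly smaller, chunk
            written.append(_write_chunk(out, src.stem, idx, header, chunk))
    return written


def _write_chunk(out_dir, stem, idx, header, rows):
    """Write one chunk file named <stem>_partNNNN.csv and return its path."""
    path = out_dir / f"{stem}_part{idx:04d}.csv"
    with path.open("w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(header)
        writer.writerows(rows)
    return path
```

Keeping the header in every chunk lets downstream jobs (for example, parallel PySpark or Glue tasks) read each piece as a standalone CSV.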

Role: Software Engineer Jun 2018 – Aug 2020

Client: Keyavi Data, Las Vegas, NV

Responsibilities:

● Worked as a Python Developer at AlgonoX on the product ACE (AlgonoX Cognitive Engine), which automates invoice processing using modern technologies.

● Member of the developer team on a new product, ACE BUILDER: a drag-and-drop process-automation framework that builds the UI and functional micro-service components such as document processing, extraction, transformation, and prediction.

● A supportive team player, delivering regular knowledge transfer (KT) sessions to new team members.

● Implemented new functions that simplify the work of other developers.

● Collaborated with front-end developers to integrate user-facing elements with server-side logic.

● Analyzed business logic and provided efficient solutions.

● Built and maintained SQL scripts, indexes, and complex queries for data analysis and extraction.

● Performed various forms of testing to enhance the product's reliability, and debugged the product while introducing new features.

● Debugged and fixed UAT issues and supported production.

● Involved in designing and implementing frameworks that automate workflows.

● Played a prominent role in delivering business-logic solutions for various clients.
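The component-pipeline idea behind a framework like ACE BUILDER can be sketched in plain Python. The component names and the invoice format below are hypothetical illustrations, not the product's actual API:

```python
def run_pipeline(document, steps):
    """Pass a document through an ordered chain of processing components,
    each taking and returning a dict."""
    for step in steps:
        document = step(document)
    return document


# Hypothetical components mirroring the stages named above.
def extract_fields(doc):
    """Extraction: pull a structured field out of raw invoice text."""
    total_line = next(
        line for line in doc["text"].splitlines() if line.startswith("Total:")
    )
    doc["total"] = float(total_line.split(":")[1])
    return doc


def normalize(doc):
    """Transformation: normalize the currency amount to integer cents."""
    doc["total_cents"] = round(doc["total"] * 100)
    return doc


def predict_category(doc):
    """Prediction: a stub classifier flagging high-value invoices."""
    doc["priority"] = "high" if doc["total"] > 1000 else "normal"
    return doc
```

Composing independent steps this way is what makes a drag-and-drop builder possible: the UI only has to choose and order the components.

```python
invoice = run_pipeline(
    {"text": "Invoice 17\nTotal: 1250.00"},
    [extract_fields, normalize, predict_category],
)
```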

Role: Software Engineer Feb 2017 – May 2018

Client: Valley Bank, New York, NY

Responsibilities:

● Performed Exploratory Data Analysis on a dataset of more than 70 attributes to identify key performance indicators.

● Automated the report pipeline and built dashboards and evaluation reports using Python.

● Gathered the client's expectations from the BA team and created flow charts.

● Responsible for maintaining the integrity of the SQL database and reporting any issues to the database architect.

● Modified and maintained SQL Server stored procedures, views, and ad-hoc queries used in the SEO process.

● Collaborated with cross-functional marketing teams and senior executives to identify and recommend quality improvements for the project.
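One common way to screen 70+ attributes for candidate KPIs during EDA is to rank them by absolute correlation with a target metric. A minimal sketch using only the standard library (the attribute and target names are illustrative, not the bank's actual data):

```python
import math
import statistics


def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)


def screen_kpis(data, target, top_n=5):
    """Rank numeric attributes by absolute correlation with a target column.

    `data` maps attribute name -> list of values; `target` names the column
    to explain. Returns the top_n attribute names, strongest first.
    """
    ys = data[target]
    scores = {
        name: abs(pearson(values, ys))
        for name, values in data.items()
        if name != target
    }
    return sorted(scores, key=scores.get, reverse=True)[:top_n]
```

Correlation screening is only a first pass; shortlisted attributes would still need checks for confounding and non-linear relationships before being reported as KPIs.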

Role: Intern Sep 2016 – Jan 2017

TCS, Chicago, IL

Responsibilities:

● Gained experience in both testing and production environments.

● Learned professional coding practices.

● Developed a POC using Python.

● Completed all tasks assigned by the instructor within the specified time frame.
