
Sampada Vinodrao Babhulkar

Python Data Scientist
Virtusa Technology - https://www.virtusa.com/

CONTACT

7822995260
Babhulkarsampada06@gmail.com
Harriram Nagar, Ward No. 11, Pulgaon, Taluka - Deoli, District - Wardha, Maharashtra, 442302

PROFILE

Looking for a challenging role in a reputable organization to utilize my Python Data Science / Machine Learning / AI / Django REST web framework skills for the growth of the organization as well as to enhance my knowledge about new and emerging trends in the IT sector. Highly motivated individual and a certified Data Science professional with strong PySpark / statistical analysis skills, attention to detail, and a solid coding background, looking to obtain a position as an ML / AI / Data Engineer / Web Developer specialist.

EDUCATION

 B.E. - Information Technology, Datta Meghe College of Engineering, Technology & Research (DMIETR), Sawangi, Wardha
 University - Rashtrasant Tukdoji Maharaj Nagpur University
 Passing year - 2020

KEY PROFESSIONAL

Python | Django | AWS | Data Science | Machine Learning | AI | PySpark

AWARDS

Awards and Achievements:
 Awarded Excellent Performer by Virtusa for brilliant performance.
 Received an award from the client WorldPay, UK for extraordinary performance in the successful delivery of the 'Omni-Channel Analytics for Payment Terminal' project.

CERTIFICATIONS

 Certified Python Programming Professional (CPPP)

Virtusa - https://www.virtusa.com/ - Python Data Scientist – May 2020 - Present

 Overall 2 years of experience - Python | Data Science | Machine Learning | Django REST Web Framework | ML Automation
 Verticals / System / Component / Module – Telecom, Payment Gateway, Supply Chain
 Clients: Rogers Communications, Canada | WorldPay, United Kingdom

Experience Summary - Technology, Tools and Platform

Major Tools, Associated Technologies & Environment
 Unstructured database used – MongoDB
 UI application programming languages - HTML, CSS, JavaScript, jQuery and Bootstrap
 Web scraping library: Beautiful Soup 4
 Web framework involvement: Django REST
 BI tool used - Tableau
 Parsing XML - XML Schema
Process worked with: DevOps + Agile + V-Model

Software Programming Language Experience:

Python | Java | C++ | JavaScript | CSS | HTML | Bootstrap | jQuery

Cloud environment worked with: AWS EC2 Server

Important Django REST libraries worked with:

Django-import-export | Django-crispy-forms | Django-compressor | Easy-thumbnails | Celery | Python-decouple | Django-allauth | Django-extensions

Important Data Science libraries worked with:

Matplotlib | NumPy | Pandas | Scikit-learn | TensorFlow | Seaborn | Keras | SciPy


PROFESSIONAL WORK SUMMARY

 Highly experienced Data Scientist with 3+ years' experience in Data Extraction, Data Modeling, Data Wrangling, Statistical Modeling, Data Mining, Machine Learning and Data Visualization.
 Expertise in transforming business resources and requirements into manageable data formats and analytical models, designing algorithms, building
models, developing data mining and reporting solutions that scale across a massive volume of structured and unstructured data.
 Proficient in managing entire data science project life cycle and actively involved in all the phases of project life cycle including data acquisition, data
cleaning, data engineering, features scaling, features engineering, statistical modeling, testing and validation and data visualization.
 Proficient in Machine Learning algorithms and Predictive Modeling, including Regression Models, Decision Trees, Random Forests, Sentiment Analysis, Naïve Bayes Classifier, SVM and Ensemble Models.
 Collaborated with data engineers and operation team to implement ETL process, wrote and optimized SQL queries to perform data extraction to fit
the analytical requirements.
 Explored and analyzed the customer specific features by using Spark SQL.
 Performed univariate and multivariate analysis on the data to identify any underlying pattern in the data and associations between the variables.
 Performed data imputation using the Scikit-learn package in Python (a minimal sketch follows this list).
 Worked on data cleaning and ensured data quality, consistency and integrity using Pandas and NumPy.
 Used SSIS to create ETL packages to Validate, Extract, Transform and Load data into Data Warehouse and Data Mart.
 Developed and implemented predictive models using machine learning algorithms such as linear regression, classification, multivariate regression,
Naive Bayes, Random Forests, K-means clustering, KNN, PCA and regularization for data analysis.
 Wrote complex Spark SQL queries for data analysis to meet business requirement.
 Analyzed and prepared data and identified underlying patterns in datasets by applying historical models; collaborated with Senior Data Scientists to build an understanding of the data.
 Performed data manipulation, data preparation, normalization and predictive modelling; improved efficiency and accuracy by evaluating models in Python.
 Familiar with various ML algorithms such as Linear Regression, Logistic Regression, KNN, K-Means, Naïve Bayes, Mean, Median, Mode, Decision Tree, Random Forest, Support Vector Machine, Principal Component Analysis, NLP and XGBoost.
 Used various core Data Science/ML libraries and packages such as PySpark, Pandas, NumPy, Seaborn, Matplotlib, Scikit-learn, Keras, TensorFlow, SciPy, Statsmodels and Plotly.
 This project was focused on customer segmentation based on machine learning and statistical modelling, including building predictive models and generating data products to support customer segmentation.
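
The bullets above describe a fairly standard cleaning / imputation / modelling loop. Below is a minimal, hypothetical sketch of that workflow in Python; the CSV file, column names and target label are illustrative assumptions, not details taken from any client project.

# Minimal sketch: Pandas cleaning, Scikit-learn imputation and a simple model.
import pandas as pd
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Load raw data and drop exact duplicates (basic data-quality step with Pandas).
df = pd.read_csv("customer_data.csv").drop_duplicates()        # hypothetical file

features = ["age", "monthly_spend", "tenure_months"]           # assumed numeric columns
target = "churned"                                             # assumed binary label

# Impute missing numeric values with the column median.
imputer = SimpleImputer(strategy="median")
X = imputer.fit_transform(df[features])
y = df[target]

# Hold out a test split and fit a simple logistic regression model.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Test accuracy:", accuracy_score(y_test, model.predict(X_test)))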

Django REST Web Framework Task

 Designed the front end and back end of the application using Python on the Django web framework (a minimal sketch follows this list).
 Analyzed the requirements, designed the task flow using flow charts and, accordingly, designed the flow between the pages of the UI.
 Responsible for creating website functionality with JavaScript, HTML and CSS.
 Used HTML, CSS, JavaScript and AJAX for the development of the website's user interface.
 Used various Django libraries such as Django-rest-framework, Django-cors-headers, Django-debug-toolbar, Django-extensions, Sentry-sdk, Django-allauth, Django-filter and Django-import-export.
 Experience in developing views and templates with Python and Django's view controller and templating language to create a user-friendly website interface.
 Used JavaScript and JSON to update portions of a webpage.
 Expertise in developing consumer-facing features and applications with Python, Django, HTML, Behavior Driven Development (BDD) and pair programming.
 Experience in writing the required XML Schema documents and implementing the framework for parsing XML documents.
 Modified existing Python/Django modules to deliver data in specific formats.
 Wrote Python scripts to parse JSON documents and load the data into the database.
 Performed data analysis using Pandas to convert data into tabular format.
 Used Google APIs and created visualizations such as pie charts and donut charts, displayed in the web application.
 Utilized CSS and Bootstrap for the development of the web applications.
 Used pickle/unpickle in Python to share information across the applications.
 Utilized Python libraries such as NumPy and Matplotlib for generating graphical reports.
 Built SQL queries for performing CRUD operations (create, read, update, delete).
 Experienced with Git version control and deployed the project to Heroku.
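
The Django REST work above follows the usual serializer / viewset / router pattern. A minimal sketch of that pattern is shown below; the Report model and its fields are hypothetical placeholders used only to illustrate the approach, not code from the projects listed here.

# Minimal Django REST Framework sketch: model, serializer, viewset and router.
from django.db import models
from rest_framework import serializers, viewsets, routers

class Report(models.Model):                     # hypothetical model
    title = models.CharField(max_length=200)
    created_at = models.DateTimeField(auto_now_add=True)

class ReportSerializer(serializers.ModelSerializer):
    class Meta:
        model = Report
        fields = ["id", "title", "created_at"]

class ReportViewSet(viewsets.ModelViewSet):
    # Exposes standard CRUD endpoints (list/retrieve/create/update/delete) as JSON.
    queryset = Report.objects.all()
    serializer_class = ReportSerializer

router = routers.DefaultRouter()
router.register(r"reports", ReportViewSet)
# In urls.py: urlpatterns = [path("api/", include(router.urls))]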

Technical Skills:

 Core libraries worked with: Matplotlib | NumPy | Pandas | Scikit-learn | Seaborn
 Process worked with: DevOps + Agile + V-Model
 Machine Learning: Logistic Regression, Linear Regression, K-means
 Analytic/predictive modeling tools: Jupyter, Anaconda
 Visualization tools: Tableau, Python – Matplotlib
 ETL: Pyetl
 Programming languages: Python, SQL, Java
 Tools: SQL Developer
 Web designing: HTML, CSS, JavaScript
 Web server: WSGI, Apache
 Database skills: MySQL, Oracle
 Versioning tools: Git
 Build tools: PyBuilder
 RDBMS: Oracle 10g
 Unstructured database: MongoDB
 Continuous integration tool: Jenkins
 Data modeling tools: Oracle Designer
 Framework software: Django, Flask
 Defect reporting tool: HP ALM
 Web services: REST – JSON, DRF (Django REST Framework)
 Development tools: PyCharm, Notepad++
 OS and networking: Windows 7, Windows 10, Ubuntu/Linux 18.2

Project Sequence 1

Project Name: Wireless Communication & Data Center Router – Enterprise Ethernet
Client: Rogers Communications, Canada
System/Component: Telecom OSS, CRM
Technology: Python, Django, Oracle, Git, SVN, REST web services, Pandas, ETL, Statistical Analysis
Role: Python Data Analyst / Python Web Developer

Project Detail Description:

It is a high performance modular and fully redundant aggregation router, designed to enable high quality network service delivery for RAN and
fixed/mobile converged metro aggregation networks. In its category, it sets a new benchmark for port density by scaling up to 144x10G and 24x100G
interfaces and offering up to 2.7Tbps switching capacity in a space efficient 5RU chassis with front access for all field replaceable units allowing an overall
lower OPEX. It supports VPN services over IP/MPLS networks, service provider SDN, service exposure using NETCONF/YANG, extensive quality of service
and precise synchronization features. The Router 6274 has strong security features such as IPSec and vendor software authentication for ubiquitous
deployment. With 2.7Tbps of switching capacity, the Router 6274 delivers performance needed to fully support LTE, LTE Advanced, 5G, Fixed Mobile
Convergence and Enterprise applications. The Router 6274 is part of the Ericsson Router 6000 Series, a radio integrated and subscriber aware IP transport
family of products. The Router 6000 offers a range of high-performance routers with resiliency features and form factors optimized for the various needs of
metro and backhaul networks. This equipment is an advanced 4G/5G access router and pre-aggregation router with 100Gb forwarding capacity.

Responsibilities :

 Experience in developing entire frontend and backend modules using Python on Django Web Framework.
 Experience in working at various phases of project such as analysis, design, development, and testing.
 Implemented MVC architecture using the Django framework model and developed web applications with a superb interface.
 Created user interface of website using Python, HTML5, CSS, JSON and JQuery. Used CSS bootstrap framework for developing web
application.
 Developed the business logic in views for the URLs created and linked the webpages to functions in views to show the output to the end-user
or to store information from the website into the database
 Worked on Django REST framework as it is much faster to read data and it can be cached. REST allows more formats than SOAP and gives
better support for browser clients as it supports JSON.
 Wrote scripts for data import, data export and data modeling.
 Worked on the Django ORM API to create and insert data into tables and access the database (a minimal sketch follows this list).
 Extensive experience in using Python packages such as NumPy, SciPy, Pandas, Beautiful Soup, Pickle and OS.
 Built multifunction readmission reports using Python Pandas and the Django framework.
 Involved in preparing the low-level design of the application and taking part in software and architectural development activities.
 Involved in designing and preparation of call flows with usability services.
 Performed exploratory data analysis using Matplotlib.
 Collected historical and third-party data from different data sources; improved operational activities using Linear and Logistic Regression.
 Understood and analyzed customer requirements and business logic; performed data cleansing, data imputation and data preparation using Scikit-learn and NumPy.
 Conducted software analysis, programming, unit and white-box testing, and debugging.
 Identified production and non-production application issues, ensured designs comply with specifications, and transformed requirements into stipulations.
 Supported continuous improvement by investigating alternatives and technologies and presenting for architectural review.
 Managed Python application development; developed, tested, implemented and maintained application software.
 Validated previously developed Python reports, fixed the identified bugs and re-deployed them.
 Recommended changes to improve established Python application processes.
 Developed technical designs for application development.
 Developed application code for Python programs and interacted with the client to sort out requirement issues.
 Used HP ALM to handle the defect management process, implementing a working timeline and deadline adherence.
 Created reports and shared them with top management.
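
As referenced above, a minimal sketch of parsing a JSON document and loading it through the Django ORM; the app name, model and JSON layout are hypothetical assumptions, and the snippet is meant to run inside a Django context (for example a management command).

# Hypothetical sketch: parse a JSON export and load it via the Django ORM.
import json
from myapp.models import Customer          # assumed app and model

def load_customers(path="customers.json"):
    with open(path) as fh:
        records = json.load(fh)            # expected: list of {"name": ..., "city": ...}
    # bulk_create issues a single batched INSERT instead of one query per row.
    Customer.objects.bulk_create(
        [Customer(name=rec["name"], city=rec["city"]) for rec in records]
    )

# Reading back through the ORM rather than raw SQL:
# recent = Customer.objects.filter(city="Toronto").order_by("-id")[:10]
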
Project Sequence 2

Project Name: Dynamic Route Optimization
Client: Rogers Communications, Canada
System/Component: CRM, E-commerce Analytics, Payment
Technology: Python, ETL, Django REST
Role: Python Data Analyst / Python Web Developer

Project Detail Description:

Inventory management is one of the most typical machine learning use cases in the supply chain. Machine learning can help solve the problem of under- or over-stocking. Based on data that can be sourced from many areas, such as the marketplace environment, seasonal trends, promotions, sales and historical analysis, ML can predict demand growth. For the forecast to be accurate, users need a wide range of data. When the number of data sets is insufficient for effective analysis, machine learning offers several methods to solve the problem: data augmentation, incremental learning and reinforcement learning. Various modules: Inventory Management, Warehouse Management, Logistics & Transportation, Production, Chatbots, Customer Service, Security, Business.
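
As a rough illustration of the demand-forecasting idea described above, the sketch below fits a Scikit-learn regressor to a hypothetical historical-sales table; the file, feature names and target column are assumptions, not project data.

# Illustrative demand-forecasting sketch for the inventory use case above.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

sales = pd.read_csv("historical_sales.csv")                               # hypothetical file
features = ["month", "promotion_flag", "avg_price", "prev_month_units"]   # assumed columns

X_train, X_test, y_train, y_test = train_test_split(
    sales[features], sales["units_sold"], test_size=0.2, random_state=0
)

# A tree ensemble can pick up seasonal and promotion effects without manual scaling.
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("MAE on held-out rows:", mean_absolute_error(y_test, model.predict(X_test)))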

Roles and Responsibilities :

 Involved in requirement analysis, design, estimation and testing of the assigned tasks in OpenStack together with the BA.
 Interpreted data and analyzed results using statistical techniques.
 Developed and implemented data analyses, data collection systems and other strategies that optimize statistical efficiency and quality.
 Designed Python scripts to ensure functionality meets customer requirements.
 Acquiring data from primary or secondary data sources and maintaining databases
 Work with stakeholders to determine how to use business data for valuable business solutions
 Search for ways to get new data sources and assess their accuracy
 Browse and analyze enterprise databases to simplify and improve product development, marketing techniques, and business processes
 Create custom data models and algorithms
 Use predictive models to improve customer experience, ad targeting, revenue generation, and more
 Develop the organization’s test model quality and A/B testing framework
 Coordinate with various technical/functional teams to implement models and monitor results
 Develop processes, techniques, and tools to analyze and monitor model performance while ensuring data accuracy

Project Sequence 3

Project Name: Omni-Channel Analytics for Payment Terminal
Client: Worldpay, United Kingdom
System/Component: Integrated Payment System
Technology: Python, Django, Oracle, Git, SVN, REST web services, Pandas, ETL, Statistical Analysis
Role: Python Data Analyst / Python Web Developer

Project Detail Description:

Improve your sales conversion rate and marketing ROI by analyzing and identifying patterns in user payments data, such as customers' preferred payment methods, high-performing stores, or payment values and volumes during normal and peak seasons. This solution helps translate complex data sets into meaningful data visualizations, customize dashboards most relevant for sales, marketing and operations, create custom alerts for analyzing critical KPIs such as transaction volumes, denial rates, stand-ins and reversals, and improve adoption of new payment methods and channels with pattern and trend analysis.

The ACH system consists of computers working together to process payments automatically. There's no need to manually handle payments (on your part or the biller's). ACH is a "batch" processing system that handles millions of payments at the end of the day. Clearing house: the network uses two central "clearing houses" - all requests run through either the Federal Reserve or The Clearing House. This allows for efficient matching and processing among numerous financial institutions and makes money transfers easy with minimal labor and cost.

ACH payments are electronic payments that are created when the customer gives an originating institution, corporation, or other customer (originator) authorization to debit directly from the customer's checking or savings account for the purpose of bill payment. Customers who choose ACH payment must first authorize you to debit their bank account for the amount due. Authorization must conform to the requirements of the ACH Operating Rules and must be either written and signed, or electronically displayed. See www.nacha.org for more information. This ACH payment supports the following payment types: Electronic Check - TEL and WEB payments; Check Conversion - POP, ARC, and RCK payments.

Revenue & pricing. Increase revenue and set pricing by tying pricing to your pricing and product strategies. We can help select
and implement a pricing and billing solution to execute your strategy.
Cost of service. Capture as-is costs and compare with bank and industry benchmarks to help drive your cost management program
Cost of quality. Optimize costs by enhancing straight through processing to minimize manual intervention for repairs and investigations,
enabling the realization of ‘right first time.’
Fraud & risk management. Reduce the incidence of fraud by combining control process definition with skills in leading industry
platforms for risk management.
Information handling. Payment processing revenues continue to fall but banks still must manage escalating amounts of
payment information. Building a body of knowledge from customer data points can help structure value-added services (VAS) and provide
opportunities for cross selling.Business Benefits:
Allows employee payments without printing checks, stuffing envelopes or paying for postage
Facilitates regular customer payments without having to transport actual paper checks to the bank,
Has lower fees than credit card payments ,Electronic process makes vendor and supplier payments easier and faster, while keeping electronic records of all
transactions

Roles and Responsibilities :

 Understood and analyzed customer requirements and business logic.
 Built multifunction readmission reports using Python Pandas and the Django framework.
 Involved in preparing the low-level design of the application.
 Took part in software and architectural development activities.
 Involved in designing and preparing call flows with usability services.
 Performed exploratory data analysis using Matplotlib; collected historical and third-party data from different data sources.
 Improved operational activities; used Linear and Logistic Regression; performed data cleansing, data imputation and data preparation using Scikit-learn and NumPy.
 Conducted software analysis, programming, unit and white-box testing, and debugging.
 Identified production and non-production application issues, ensured designs comply with specifications, and transformed requirements into stipulations.
 Supported continuous improvement by investigating alternatives and technologies and presenting for architectural review.
 Managed Python application development; developed, tested, implemented and maintained application software.
 Validated previously developed Python reports, fixed the identified bugs and re-deployed them.
 Recommended changes to improve established Python application processes; developed technical designs for application development.
 Developed application code for Python programs and interacted with the client to sort out requirement issues.
 Used HP ALM to handle the defect management process, implementing a working timeline and deadline adherence.
 Created reports and shared them with top management.

PERSONAL DETAILS:

 Father's name: Vinodrao R. Babhulkar
 Mother's name: Mangala Vinodrao Babhulkar
 Current address: Harriram Nagar, Ward No. 11, Pulgaon, Taluka - Deoli, District - Wardha, Maharashtra, 442302
 Date of birth: 13/06/1995
 Marital status: Single
 Languages: Marathi, Hindi, English

Date:-

Regards

Sampada Vinodrao Babhulkar
