Ramesh Yamdra
Ram9mca@hotmail.com
Ph: 860-884-1557
Artificial Intelligence (AI) / Generative AI Engineer with strong proficiency in developing and implementing
AI solutions. Equipped with a strong foundation in machine learning algorithms, deep learning frameworks, and
data analysis. Experienced in designing end-to-end AI solutions, optimizing them for performance, and integrating
AI models into existing systems.
TECHNICAL SKILLS:
Machine Learning: Neural Networks, Decision Trees, SVM, NLP, Reinforcement Learning, MLflow, GPT-4,
LLaMA 2, PyTorch, Scikit-learn, NumPy, SciPy, RAG and Hugging Face.
Programming Languages: Python (TensorFlow, Keras), R, Java, C++, SQL
Data Processing: NumPy, pandas, Hive, Impala, Hadoop, Spark.
Other Technical Skills: GitHub, Docker, Kubernetes, RESTful APIs, GraphQL, Linux
Data Formats: ANSI X12 5010/4010, JSON, XML; Web Technologies: TypeScript, React
Cloud Technologies: AWS (EC2, S3, SageMaker), Microsoft Azure, Jenkins CI/CD, JIRA, Jupyter Notebooks,
Web and Mobile Apps
Databases: PostgreSQL
CERTIFICATIONS:
WORK EXPERIENCE:
Built a custom chatbot for enterprise clients, which led to a 20% decrease in average help desk ticket
resolution time.
Collaborated with software developers to integrate AI solutions into existing systems and
applications.
Ensured seamless integration between AI models and other software components.
Supported the documentation of code, models and methodologies.
Developed and maintained AI-based systems and applications, improving operational efficiency by
automating manual processes and reducing errors by 25%.
Worked on LLMs, RAG, and agents/tools using Python, TensorFlow, Azure ML, PySpark, R, SQL, and
Azure Cognitive Services (see the illustrative RAG sketch after this role's bullets).
Developed and maintained AI-based databases, implementing data cleaning and preprocessing
techniques that improved data quality by 30% and reduced data retrieval time by 40%.
Wrote code in Python and analyzed issues in the AI models.
Worked on large-scale build, release, continuous integration/continuous deployment (CI/CD), and
observability methods
Prepared Signoff documents and User manuals and conducted End User training.
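Illustrative RAG sketch (referenced above): a minimal retrieve-then-prompt flow in Python, assuming the sentence-transformers library; the embedding model, sample documents, and helper names are placeholders for illustration, not the client's actual pipeline.

import numpy as np
from sentence_transformers import SentenceTransformer

# Tiny illustrative knowledge base; a real system would index help-desk articles.
documents = [
    "Password resets are handled through the self-service portal.",
    "VPN access requests require manager approval and take one business day.",
]

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model
doc_vectors = model.encode(documents, normalize_embeddings=True)

def retrieve(question, top_k=1):
    # With normalized vectors, cosine similarity is a plain dot product.
    query_vec = model.encode([question], normalize_embeddings=True)[0]
    scores = doc_vectors @ query_vec
    return [documents[i] for i in np.argsort(scores)[::-1][:top_k]]

def build_prompt(question):
    # Ground the generator (e.g. a GPT-4 or LLaMA 2 endpoint) in the retrieved context.
    context = "\n".join(retrieve(question))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

print(build_prompt("How do I reset my password?"))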
EDI Data Engineer, Healthcare Client, Washington DC Apr 2017 to Jan 2021
Analyzed all EDI X12 files, recommended process improvements, and coordinated with
internal/external customers.
Analyzed error EDI transactions and log files for issues in inbound and outbound EDI files, covering both
real-time and batch X12 EDI files.
Validated data at all stages of the ETL process, including source-to-target data mapping.
Tested various Ab Initio graphs which facilitate Daily, Weekly & Monthly Loading of Data.
Performed Verification, Validation, and Transformations on the Input data before loading into target
database.
Wrote complex SQL queries to validate source and target data (see the reconciliation sketch after this role's bullets).
Executed ETL Test Cases with respect to Source and Target database tables.
Validated the rejected records in Ab Initio jobs.
Performed on-demand report validation, such as adjudicated-claims and denied-claims reports.
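Reconciliation sketch (referenced above): a minimal example of the kind of source-to-target SQL validation described in this role; the PostgreSQL driver, schema/table names, and connection string are assumed placeholders, not the client's actual objects.

import psycopg2  # PostgreSQL driver (PostgreSQL is listed under Databases above)

# Placeholder source/target tables; real checks would cover each mapped entity and column.
RECONCILIATION_SQL = """
    SELECT (SELECT COUNT(*) FROM staging.claims_src) AS source_rows,
           (SELECT COUNT(*) FROM dw.claims_fact)     AS target_rows;
"""

def run_row_count_check(dsn):
    # Compare source and target row counts and report a pass/fail status.
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(RECONCILIATION_SQL)
        source_rows, target_rows = cur.fetchone()
        status = "PASS" if source_rows == target_rows else "FAIL"
        print(f"row_count check: source={source_rows} target={target_rows} -> {status}")

run_row_count_check("dbname=edw user=etl_qa")  # hypothetical connection string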
EDI Data Engineer, Enrollment (834) Federal and State Exchanges Project,
Health Plan Services, Tampa Jul 2015 to Apr 2017
Participated in Agile Scrum ceremonies and meetings: project architecture reviews, story time estimation,
sprint reviews, project stories and tasks, daily stand-ups, project demos, and backlog grooming, as well as
existing-systems requirements meetings, production and release management meetings, and meetings on
data migration and production issues.
Developed and Tested Various Reusable Data Warehouse ETL Transformations which facilitate Daily,
Weekly & Monthly Loading of Data.
Handled the process and documentation of test plans, test cases, and test procedures, as well as
regression testing and defect reporting and tracking.
Tested and validated Enrollment (834) and Claims (837) web services using the SoapUI tool.
Worked extensively with inbound 834 and 820 Federal and State Exchange transactions and 837 claim EDI
transactions.
Prepared complex SQL queries and analyzed data warehouse ETL (Extraction, Transformation, and Loading)
processing.
Monitored all error log file processing, troubleshot processes to resolve transaction file issues, and
retested the affected transaction files.
Prepared test scripts based on functional requirements and uploaded them into HPQC.
Used SQL to test various reports and ETL load jobs in development, QA, and production environments.
Sr. EDI Data Engineer, HIPAA 5010 EDI Flow Processing, EMI Health, UT Dec 2013 to July 2015
Involved in Implementing Agile (Scrum) methodologies for the complex project modules.
Worked as ETL Tester responsible for the requirements gathering / ETL Analysis, ETL Testing and
designing of the flow and the logic for the Data warehouse project.
Tested mapping for extracting, cleansing, transforming, integrating, and loading data using Informatica
ETL Tool.
Validated ANSI X12 EDI structure/EDI format mapping documents and validated inbound and outbound EDI
transactions: 837P, 837D, 837I, 834, 820, 276/277, 270/271, and 999/997 files.
Wrote UNIX shell scripts. Worked closely with the DBA team to regularly monitor the systems for
bottlenecks and implement appropriate solutions.
Monitored various application backend processes and took the necessary action upon failures.
Resolved day-to-day issues reported by application/department users in a production support
environment, and prepared test plans, test data, error logs, and end-user documents to cover all the
business requirements.
Involved in the various stages of project development such as system analysis, system design,
implementation, and system integration/Data Migration with testing.
Participated in project Functional Requirement Specification (FRS) and Business Requirement
Specification (BRS).
Worked extensively with inbound 837I and 837P and outbound 835 claims processing systems.
Responsible for basic and advanced Transformations like Sorter, Router, Filter, Joiners, Sequence
generator, Aggregator, Expression, Mapplets, Union and lookup in mappings.
Designed test scripts for data loads and developed UNIX shell scripts as part of the ETL process,
automating the loading and pulling of data.
Performed data analysis and data profiling using SQL and Informatica Data Explorer on various source
systems, including Oracle and Teradata. Involved in writing SQL scripts.
EDI Data Engineer, HIPAA 5010 Implementation, BCBS NE Jan 2012 to July 2013
Involved in the various stages of project development such as system analysis, system design,
implementation, and system integration/Data Migration with testing.
Participated in project Functional Requirement Specification (FRS) and Business Requirement
Specification (BRS).
Prepared data validation scripts for the HNS Data Mart (provider dimension, QE members dimension,
other payers’ dimensions, claims dimensions, and adjudication fact tables) and validated source tables
against target ODS tables.
Worked extensively with inbound 837I and 837P and outbound 835 claims processing systems.
Wrote test cases for Tidal job scheduling and mapping rules, and validated Business Intelligence reports.
Involved in automated processing activities. Prepared sign-off documents and user manuals.
EDI Data Analyst, FHCP HIPAA 5010, Florida Health Care Plan (BCBS) Jan 2011 to Dec 2011
Implemented and followed the Agile methodology for software development and used story points to
estimate requirements.
Tracking and prioritizing defects and collaborating with developers to resolve test bugs. Providing
progress and status reports to the Project Manager.
Performing data verification/validation, developing test procedures, test scenarios, test plans, test
cases and test scripts.
Validated test data for the transaction sets 837 I, P & D, 834, 270, 276, 278, and flat-file transaction sets
based on HIPAA standards and Blue Exchange requirements (LRM 4).
Tested the application process/flat files both before and after the data validation
process.
Wrote and validated business rules for the day-to-day master/detail business
intelligence reports.
Communicated with developers about the status of each data quality issue.
Wrote SQL queries, XPath queries, joins, data sets, and stored procedures, and validated back-end
applications.
Used Quality Standards for defect tracking and produced detailed defect reports, Pass-Fail reports and
Comparison Charts for QA status meetings.
EDI Analyst, ECI HIPAA 5010, BCBS LA, Aug 2010 to Dec 2010
Involved in the various stages of project development such as system analysis, system design,
implementation, and system integration with testing.
Implemented and followed the Agile methodology for software development and used story points
to estimate requirements.
Wrote Test cases for Enterprise Data Warehousing (EDW) Tables and Data Mart Staging Tables.
Prepared test scripts and validated external vendors’ extracted flat files based on the mapping
document.
Tested schemas by preparing XML/XSLT parsing schemas using Mapping Designer.
Validated payers’ and providers’ external applications. Prepared test data for the transaction sets 837
I, P & D, 834, 270, 276, 278, and flat-file transaction sets based on HIPAA standards and Blue Exchange
requirements (LRM 4).
Worked on Documentation, User Manuals and Non-functional requirements like system monitoring,
alerting, and performance measurement.
Data Analyst, Acute Remediation Tracking Tool (ARTT), Pfizer, Groton, Connecticut, Jan 2010 to June 2010
Involved in the various stages of project development such as system analysis, system design,
implementation, and system integration with testing.
Developed technical documentation and visualizations to enable data insight and impactful decision-
making.
Designed and coded complex SQL queries to produce actionable insights from internal databases,
increasing data analysis productivity by 25%
Prepared unit tests for master/detail web pages, SharePoint document libraries, picture
libraries, custom web parts, custom features, etc.
Prepared test cases and tested custom workflows and GUI interfaces with backend databases.
Worked on Documentation, User Manuals and Non-functional requirements.
Software Engineer, Education Management System, IUCTT, Kuala Lumpur, Malaysia. Jan 2003 to Dec 2009
Involved in microservices API design, development, testing, and implementation of the process
systems; worked on iterative life-cycle business requirements and created detailed design documents.
Performed web browser compatibility and unit testing: hyperlinks, banners, buttons, static and dynamic
text and objects, internet updates, menus, and boxes. Administered user accounts, Active Directory,
directory structure, network shares, and user policies.
Participated in Code Reviews of other modules, documents and test cases
Involved in Bug fixing of various modules that were raised by the testing teams in the application
during the Integration testing phase. Involved in end-to-end API platform setup and implementation.
Involved in Technical & Production Support.
Software Engineer, Frankbaines-Saddlery, Netsoft Microsystems, Bangalore, India. Feb 2002 to Dec 2002
Used N-tier architecture for the Business layer, Data Access Layer and Presentation Layer.
Worked intensively on the design of the user interface.
Developed Web Forms.
Wrote automated processes using JavaScript and VBScript.
Worked on XML Schemas and Cascading Style Sheets.
Created XSD and XSL files for media metadata XML files.
EDUCATION/CERTIFICATION: