Yashwanth - AI Engineer
[email protected]
Location: Houston, Texas, USA
Relocation:
Visa: H1
DATA & GENERATIVE AI ENGINEER

YASHWANTH R
Email: [email protected]
Cell: 469-294-1893
SUMMARY
AI engineer with 10 years of experience across banking, retail, and enterprise platforms, progressing from backend and financial data services into LLM-based analytics and automation. Experienced in integrating AI features into existing financial workflows, building retrieval-based assistants over structured and historical data, and stabilizing pipelines that feed analytics and insight systems. Strong foundation in transaction data, reporting, and user-facing financial summaries from prior banking and enterprise application work.
SKILLS
GenAI & LLM Engineering: Retrieval-Augmented Generation (RAG), Embedding Pipelines, Vector Indexing, Semantic Search, Prompt Engineering, Prompt Modularization, Context Window Optimization, Retrieval Filtering, Structured Output Formatting, Response Validation, Hallucination Mitigation Techniques, LLM Integration Patterns, AI Pipeline Orchestration
AI Architecture & Reliability: LLM Lifecycle Management, Grounded Response Design, Observability for AI Systems, AI Output Monitoring, Logging & Traceability, Secure AI Deployment, Compliance-Aware AI Systems, Data Lineage Tracking, Model Behavior Monitoring
Data Engineering for AI: Data Standardization, Feature Preparation for Embeddings, ETL/ELT Workflows, Data Curation for AI Consumption, Schema Normalization, Batch & Streaming Data Processing, Data Quality Validation, Metadata Management
Cloud & Deployment: AWS Services (Lambda, S3, EC2), API Deployment, Microservices Architecture, RESTful Services, CI/CD Pipelines, Containerized Deployment Patterns, Production Support & Monitoring
PROFILE HIGHLIGHTS
Build LLM/RAG systems using vector databases across banking, retail, and enterprise platforms, with a focus on production integration into existing analytics workflows.
Transform structured transaction and operational data from multi-source systems into AI-powered summaries and dashboards grounded in business metrics.
Design retrieval contexts and prompt templates using few-shot learning and chain-of-thought reasoning to map business questions into data-backed outputs aligned with finance team interpretations.
Build data pipelines transforming unstructured documents into structured formats, implement monitoring frameworks for LLM quality, and deploy models to production using cloud services and CI/CD.
Build and deploy production-ready RAG-based GenAI systems integrated into financial data platforms.
Experienced in LLM evaluation, hallucination mitigation, and response validation pipelines.
Strong background in PII masking, compliance-aware AI systems, and audit logging.
Work in regulated environments applying masking and validation patterns to stabilize enterprise pipelines while meeting banking compliance standards.
Bridge backend systems, analytics layers, and AI interfaces working with cross-functional teams to deliver explainable, production-ready features.
Stabilize enterprise data services and pipelines ensuring downstream AI features receive consistent, well-structured inputs for analytics workflows.
Integrate AI summaries into existing dashboards and insight interfaces without disrupting established user workflows or business processes.
Map business questions into data-backed prompts and retrieval flows aligned with how finance and operations teams interpret metrics and reporting definitions.

PROFESSIONAL EXPERIENCE

Citi | Generative AI Engineer | Jan 2023 – Present
Built an internal assistant that generates spending and transaction summaries from account activity data used in financial insight features.
Designed prompt templates that convert structured transaction and account fields into readable financial explanations of trends and categories.
Added retrieval over recent transaction and account history so generated insights referenced actual activity patterns.
Integrated LLM summaries into existing financial insight interfaces alongside standard account analytics.
Standardized multi-source banking data (transactions, balances, merchant info) before passing context to the LLM workflow.
Contributed to full lifecycle implementation of AI-driven financial insight systems from data preparation to production deployment.
Designed embedding workflows and managed vector indexing processes for structured banking datasets.
Implemented retrieval orchestration logic to construct contextual inputs for LLM processing.
Developed domain-aligned prompt templates tailored to financial transaction analysis.
Applied structured response formatting to ensure consistency with enterprise reporting standards.
Built response validation layers to align AI outputs with curated financial datasets.
Established monitoring and logging mechanisms to track AI system behavior within regulated environments.
Integrated AI services into existing banking analytics and reporting platforms.
Supported secure deployment practices aligned with compliance and enterprise data governance requirements.
Used retrieval context from savings goals and allocation data to generate more personalized financial summaries.
Applied masking to sensitive account and card attributes before LLM processing.
Reduced lag between new transactions and insight generation via incremental data updates.
Validated generated insights against existing rule-based financial insight outputs to ensure consistency.
Logged common financial questions to refine prompt templates and retrieval coverage.
Improved handling of merchant, date, and amount fields in prompts to reduce summary errors.
Documented prompt patterns, retrieval scope, and data fields used in transaction insight generation.

Tools: Python, OpenAI API, LangChain, Pinecone, PostgreSQL, AWS Lambda, Docker, FastAPI, Pandas, Git
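The masking-then-context pattern described above (standardizing multi-source transaction fields and masking sensitive account attributes before they reach the LLM prompt) can be sketched roughly as follows. This is a minimal illustration, not the production implementation: the field names, mask rule, and record layout are assumptions.

```python
import re

# Illustrative transaction record; field names are assumptions, not a real schema.
TXN = {"account_id": "4532-9911-2284-7765", "merchant": "GROCERY MART #214",
       "amount": 84.37, "date": "2024-03-02", "category": "Groceries"}

def mask_account(value: str) -> str:
    """Mask all but the last four digits of an account/card identifier."""
    digits = re.sub(r"\D", "", value)
    return "****" + digits[-4:]

def build_prompt_context(txns: list[dict]) -> str:
    """Standardize masked transactions into one text block for the LLM prompt."""
    lines = []
    for t in txns:
        lines.append(
            f"{t['date']} | {t['merchant']} | ${t['amount']:.2f} | {t['category']}"
        )
    return "\n".join(lines)

masked = mask_account(TXN["account_id"])
context = build_prompt_context([TXN])
```

Keeping masking in a separate step before context assembly makes it auditable: the same masked view can be logged for compliance review without re-running the prompt pipeline.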

Albertsons Companies | Data Scientist | Feb 2021 – Dec 2022
Worked on financial dashboards explaining revenue, margin, and category performance from sales and merchandising data.
Structured sales and transaction data into curated tables used across analytics and reporting workflows.
Built automated ingestion of periodic financial reports into centralized analytics datasets.
Developed descriptive explanations of metric changes using underlying report data.
Designed and maintained scalable data pipelines supporting analytical and AI-driven reporting systems.
Built curated financial and operational data layers optimized for structured analysis and insight generation.
Implemented schema standardization and transformation workflows for reliable downstream consumption.
Developed backend data services supporting enterprise reporting and analytics applications.
Applied data validation and consistency checks across reporting datasets.
Supported integration of analytics outputs into business-facing dashboards and services.
Integrated explanations directly into reporting dashboards alongside KPI tables and charts.
Compared current vs prior-period metrics to explain financial trends across regions and categories.
Ensured analytics outputs matched BI dashboard numbers by aligning to curated data sources.
Logged analyst questions around reports to improve coverage and clarity of explanations.
Tuned explanation wording to match finance reporting terminology and conventions.
Automated refresh of new reporting extracts so dashboards and explanations stayed current.
Documented metrics, report structures, and data lineage for analytics consumers.
Collaborated with finance users to refine how metric changes were described and interpreted.

Tools: Python, SQL, Tableau, Power BI, Snowflake, Apache Airflow, Pandas, NumPy, Azure, Jupyter, Git
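The current-versus-prior-period comparison used to explain metric changes across regions can be sketched as below. This is a pure-Python illustration under assumed data: the region names, periods, and figures are invented, and the phrasing rule stands in for whatever terminology the finance team actually used.

```python
# Illustrative per-region revenue by period; names and figures are assumptions.
REVENUE = {
    "West": {"2022-Q2": 120.0, "2022-Q3": 138.0},
    "East": {"2022-Q2": 200.0, "2022-Q3": 190.0},
}

def explain_changes(revenue: dict, current: str, prior: str) -> list[str]:
    """Compare current vs prior-period revenue per region and phrase the
    delta the way a finance reader would: direction plus percent change."""
    notes = []
    for region in sorted(revenue):
        cur, prev = revenue[region][current], revenue[region][prior]
        pct = (cur - prev) / prev * 100
        direction = "up" if pct >= 0 else "down"
        notes.append(f"{region}: revenue {direction} {abs(pct):.1f}% vs {prior}")
    return notes

summary = explain_changes(REVENUE, "2022-Q3", "2022-Q2")
```

Deriving the wording directly from the curated numbers is what keeps the generated explanations consistent with the BI dashboard figures.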

Citizens Bank | Senior Software Engineer | Oct 2019 – Jan 2021
Built and stabilized APIs exposing account balances and transaction history from core banking systems to digital channels.
Aggregated transaction and account data across multiple legacy banking services into consistent response formats.
Improved latency and reliability of account-summary endpoints used by mobile and online banking.
Added masking, audit fields, and trace IDs to customer-data services to meet banking compliance standards.
Resolved inconsistencies in transaction ordering and timestamps across backend sources.
Developed scalable backend services supporting enterprise financial applications.
Designed API integration layers for secure data exchange between internal systems.
Implemented structured logging and traceability mechanisms for financial transaction processing systems.
Supported performance optimization and reliability improvements across service architectures.
Implemented request tracing to follow account-data flows across services during incidents.
Supported releases affecting transaction and balance data across banking apps.
Clarified field mappings between core banking systems and downstream channels.
Documented account and transaction data definitions used by analytics and reporting teams.
Assisted internal teams using transaction data for reporting and insights features.

Tools: Java, Spring Boot, PostgreSQL, Redis, Kafka, Jenkins, Docker, Kubernetes, REST APIs, Splunk, Git
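The trace-ID propagation pattern mentioned above (following an account-data request across services during incidents) looks roughly like this. The original services were Java/Spring Boot; this is a language-agnostic sketch in Python, and the header name and event fields are illustrative assumptions.

```python
import json
import logging
import uuid

logging.basicConfig(level=logging.INFO, format="%(message)s")
log = logging.getLogger("account-service")

def handle_request(headers: dict) -> dict:
    """Reuse an incoming trace ID if present, otherwise mint one, and emit a
    structured log line that log tooling (e.g. Splunk) can correlate on."""
    trace_id = headers.get("X-Trace-Id") or uuid.uuid4().hex
    log.info(json.dumps({"event": "account_summary_read", "trace_id": trace_id}))
    # Return the same ID for outbound calls so the flow stays linked end to end.
    return {"X-Trace-Id": trace_id}

propagated = handle_request({"X-Trace-Id": "abc123"})
```

Reusing the inbound ID rather than always minting a new one is the detail that lets a single customer call be stitched together across every service it touched.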

Walt Disney World | Software Engineer | Dec 2017 – Sep 2019
Developed backend services supporting park reservations and guest itinerary views used by web and mobile apps.
Integrated reservation schedules with park-entry scan events so itineraries reflected real guest activity.
Aggregated booking, ticketing, and schedule data from multiple services into unified itinerary responses.
Improved performance of itinerary lookup APIs during peak park usage periods.
Helped synchronize reservation updates across booking and guest-experience platforms.
Supported mobile check-in features tied to reservation validation and guest identity.
Built performance-optimized backend services supporting high-traffic enterprise applications.
Designed scalable API architectures aligned with enterprise integration requirements.
Implemented service-level logging and monitoring mechanisms for operational visibility.
Supported system reliability initiatives across distributed application environments.
Investigated edge cases where schedule changes did not propagate correctly across systems.
Added logging and tracing around reservation updates to help ops diagnose guest-schedule issues.
Worked with product teams to align itinerary data structures with downstream experience features.
Documented service contracts and data flows across reservation and itinerary systems.

Tools: Java, Node.js, MongoDB, MySQL, AWS (S3, EC2, Lambda), React, Docker, Jenkins, REST APIs, Git
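The aggregation described above (merging booking, ticketing, and scan events from multiple services into one time-ordered itinerary per guest) can be sketched as follows. All record shapes, field names, and data here are assumptions for illustration.

```python
from datetime import datetime

# Illustrative event feeds from separate services; data is invented.
BOOKINGS = [{"guest": "G1", "time": "2019-06-01T10:00", "item": "Ride reservation"}]
SCANS = [{"guest": "G1", "time": "2019-06-01T09:15", "item": "Park entry"}]

def build_itinerary(guest: str, *sources: list) -> list:
    """Aggregate per-guest events from multiple services, ordered by time,
    so the itinerary reflects actual guest activity."""
    events = [e for src in sources for e in src if e["guest"] == guest]
    return sorted(events, key=lambda e: datetime.fromisoformat(e["time"]))

itinerary = build_itinerary("G1", BOOKINGS, SCANS)
```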
