| Ashish Roy Abel - Senior AI/ML Developer |
| [email protected] |
| Location: Austin, Texas, USA |
| Relocation: |
| Visa: H1B |
| Resume file: Ashish- AI_ML_Engineer_Resume_2026_1775154606752.docx |
|
Professional Summary:
Results-driven AI/ML Engineer with about 9 years of experience designing, developing, and deploying scalable machine learning, deep learning, Generative AI, and Agentic AI solutions across enterprise environments.
- Designed and implemented end-to-end machine learning pipelines for predictive analytics, classification, clustering, and anomaly detection using Python, Scikit-learn, and TensorFlow.
- Built scalable deep learning models using PyTorch and Keras for complex prediction tasks including pattern recognition and behavioral analytics.
- Developed Natural Language Processing (NLP) pipelines for text classification, sentiment analysis, topic modeling, and document intelligence.
- Implemented Generative AI applications using Large Language Models (LLMs) to automate document summarization, intelligent chatbots, and knowledge retrieval systems.
- Designed and deployed Retrieval-Augmented Generation (RAG) architectures using LangChain, LlamaIndex, and vector databases for enterprise knowledge search platforms.
- Applied advanced prompt engineering techniques to optimize LLM responses and improve contextual accuracy in AI-driven applications.
- Fine-tuned pretrained LLMs using Hugging Face frameworks to build domain-specific AI assistants and conversational agents.
- Built semantic search systems using vector embeddings and vector databases to enable intelligent information retrieval across enterprise datasets.
- Developed scalable data preprocessing and feature engineering pipelines using Pandas, NumPy, and Python-based data processing frameworks.
- Conducted exploratory data analysis (EDA) and statistical modeling to identify data patterns and improve predictive model performance.
- Implemented model optimization techniques including hyperparameter tuning, cross-validation, and ensemble learning to improve ML accuracy and reliability.
- Designed and built real-time anomaly detection systems for identifying abnormal patterns in operational and transactional datasets.
- Built computer vision models for image classification, object detection, and automated visual inspection systems.
- Developed high-performance data pipelines and ETL workflows using Spark, Hadoop, and distributed computing frameworks.
- Implemented stream processing solutions for real-time analytics and model inference pipelines.
- Developed RESTful APIs and microservices using Python and Java to expose machine learning models to enterprise applications.
- Containerized AI/ML applications using Docker and orchestrated deployments with Kubernetes for scalable production environments.
- Designed and implemented CI/CD pipelines for ML workflows, automating model training, testing, validation, and deployment.
- Built enterprise MLOps frameworks for experiment tracking, model versioning, and lifecycle management.
- Implemented model monitoring solutions to detect model drift, data drift, and performance degradation in production systems.
- Deployed scalable AI models on AWS SageMaker, EC2, and S3, enabling distributed training and high-performance inference.
- Built enterprise ML solutions on Microsoft Azure Machine Learning and Azure cloud platforms.
- Developed and deployed ML workloads using Google Cloud Vertex AI for scalable model training and production deployment.
- Integrated NoSQL databases and SQL-based systems to support high-volume data storage and retrieval for ML workflows.
- Designed data visualization dashboards using Matplotlib and Seaborn to communicate model insights and performance metrics.
- Developed AI-powered recommendation systems using collaborative filtering and deep learning approaches.
- Built scalable data pipelines and ML workflows using distributed big data technologies including Spark and Hadoop.
- Managed source code repositories and collaborative development using Git, GitHub, and Jupyter Notebook environments.
- Implemented Linux-based ML environments for scalable development, automation, and deployment of machine learning applications.
- Collaborated with cross-functional teams including data engineers, software developers, and business stakeholders to deliver enterprise AI solutions aligned with business goals.

Technical Skills:
Core AI / ML Skills: Machine Learning, Deep Learning, Natural Language Processing (NLP), Agentic AI, Computer Vision, Generative AI, Large Language Models (LLMs), Feature Engineering, Model Evaluation & Optimization
Programming Languages: Python, R, SQL, Java, Scala
AI / ML Frameworks & Libraries: TensorFlow, PyTorch, Scikit-learn, Keras, Hugging Face
Data Analysis & Visualization: Pandas, NumPy, Matplotlib, Seaborn
Generative AI & LLM Technologies: Prompt Engineering, LLM Fine-tuning, Retrieval-Augmented Generation (RAG), Vector Databases, LangChain, LlamaIndex
MLOps & Model Lifecycle: Model Optimization, Model Deployment, CI/CD for ML, Model Monitoring, Experiment Tracking
Data Engineering: Data Preprocessing, ETL Pipelines, Data Pipelines, Spark, Hadoop
Cloud Platforms: AWS (SageMaker, EC2, S3), Azure (Azure ML), Google Cloud (Vertex AI)
DevOps & Containerization: Docker, Kubernetes
Databases: SQL, NoSQL Databases, Vector Databases
Tools & Development Environment: Git, GitHub, Jupyter Notebook, REST APIs, Linux

Professional Experience:

Client: Comerica Bank - Dallas, TX | Feb 2025 - Present
Role: Sr. AI/ML Engineer
Responsibilities:
- Designed and deployed Machine Learning and Generative AI models for fraud detection, credit risk scoring, customer segmentation, and financial forecasting, improving decision accuracy and customer engagement.
- Built scalable data pipelines using Spark, Hadoop, and Databricks to process large-scale financial datasets including transactions and customer behavior.
- Developed advanced fraud detection frameworks leveraging transaction history, geolocation, and behavioral analytics to identify suspicious activities.
- Engineered real-time data ingestion pipelines using Kafka to process millions of financial transactions for low-latency fraud detection.
- Implemented causal inference models to identify drivers of customer churn, loan defaults, and campaign effectiveness.
- Developed time-series forecasting models (LSTM, statistical methods) to predict cash flow, loan demand, and transaction volumes.
- Built NLP-driven solutions to analyze customer feedback, support tickets, and call transcripts for sentiment analysis and insights.
- Applied Generative AI and LLMs for automated customer communication, financial report summarization, and intelligent knowledge search.
- Designed and implemented Agentic AI systems to automate loan processing, fraud monitoring, and financial decision workflows.
- Built Scikit-learn models for fraud detection, credit scoring, and customer lifetime value prediction.
- Optimized Random Forest and Gradient Boosting models on distributed Spark clusters, improving prediction accuracy and reducing training time.
- Developed scalable distributed ML pipelines using Spark, Python, Scala, and MapReduce for large-scale model training and deployment.
- Performed feature engineering and EDA using Pandas, NumPy, and SciPy to enhance model performance.
- Engineered robust ETL workflows for structured and unstructured financial data to support ML pipelines.
- Built and deployed REST APIs and microservices for real-time scoring of fraud detection and credit risk models.
- Integrated ML models with core banking systems and digital payment platforms for real-time decisioning.
- Implemented LLM-powered virtual assistants with vector search to enhance customer support and internal operations.
- Delivered interactive dashboards using Power BI and Tableau to monitor fraud KPIs, loan performance, and risk metrics.
- Established MLOps frameworks using MLflow, Airflow, Docker, and Kubernetes for model deployment, monitoring, and versioning.
- Designed CI/CD pipelines and standardized ML workflows to accelerate model release cycles and improve reliability.
- Implemented model monitoring systems to track drift, latency, and prediction accuracy for continuous improvement.
- Conducted A/B testing and validation frameworks to evaluate model performance and business impact.
- Ensured Responsible AI practices, including fairness, explainability, and compliance with financial regulations.
- Collaborated with risk, compliance, product, and engineering teams to integrate ML solutions into banking workflows.
- Deployed scalable ML solutions on cloud platforms, enabling enterprise-grade analytics and intelligent automation.

Environment: Python, R, Spark, Scala, Hadoop, Kafka, Databricks, Azure ML, AWS SageMaker, MLflow, Airflow, TensorFlow, PyTorch, Keras, Hugging Face, Power BI, Tableau, SQL Server, REST APIs, Docker, Kubernetes, MapReduce, Pandas, NumPy, SciPy

Client: Sasken Technologies - India | Apr 2021 - Dec 2023
Role: Sr. Data Scientist
Responsibilities:
- Designed ML models for auto damage image recognition, integrating CNN-based computer vision models that reduced claim processing time by 30 percent.
- Built fraud detection algorithms using ensemble models (XGBoost, Random Forests) on structured and unstructured claim data, preventing improper payouts.
- Developed scalable ETL pipelines on Databricks, Spark, and Hadoop for claim ingestion from insurers, body shops, and repair centers.
- Applied Generative AI and LLMs for automated summarization of accident descriptions, repair notes, and adjuster feedback, improving case turnaround.
- Delivered real-time dashboards in Tableau and Power BI to track claim settlement SLAs, fraud alerts, and adjuster productivity.
- Deployed MLOps pipelines with MLflow, Docker, and Kubernetes, reducing model deployment cycles for production AI solutions.
- Created severity scoring models for workers' compensation claims, enabling insurers to predict treatment costs and recovery timelines more accurately.
- Integrated computer vision pipelines for vehicle part recognition, linking directly with repair databases for cost estimation automation.
- Collaborated with insurance adjusters and underwriters to align ML model outputs with real-world claims decision-making processes.
- Implemented graph-based ML models to detect suspicious provider and claimant networks, identifying collusion in fraud rings.
- Used time-series forecasting for claim volume prediction, supporting workforce planning and SLA compliance across insurers.
- Built real-time Kafka pipelines to process high-frequency insurance claims, enabling near-instant model scoring for fraud and severity.
- Enhanced document processing workflows with NLP models, extracting data from accident reports, repair invoices, and legal documents.
- Applied explainable AI (XAI) techniques to ensure transparency in auto damage predictions, addressing regulator and customer trust concerns.
- Developed semantic search tools with vector databases for adjusters to retrieve prior similar claims, improving decision speed.
- Conducted A/B testing on automated claim settlement models, demonstrating measurable reductions in claim closure times.
- Migrated legacy claims AI systems to AWS SageMaker and Azure ML, reducing infrastructure costs while increasing reliability.
- Implemented bias detection frameworks in workers' compensation severity scoring models to ensure fair recommendations.
- Trained CNN and Transformer-based vision models on multi-million image datasets for auto part classification and repair estimation.
- Delivered executive presentations highlighting ROI from AI-driven claims automation, securing investments in AI initiatives.
Environment: Python, AWS SageMaker, TensorFlow, PyTorch, OpenCV, Kafka, Lambda, API Gateway, SQL, Tableau, CloudWatch, Docker, Jenkins

Client: Dell Technologies - India | Feb 2020 - Mar 2021
Role: Data Scientist
Responsibilities:
- Analyzed large-scale operational and user behavior datasets to identify patterns, trends, and insights that support business decision-making.
- Developed statistical and predictive models using Python and R to evaluate customer behavior, product performance, and operational metrics.
- Built machine learning models using Scikit-learn for anomaly detection to identify unusual patterns in system usage and business transactions.
- Applied Natural Language Processing (NLP) techniques to analyze unstructured text data such as customer feedback, support tickets, and product reviews to extract actionable insights.
- Designed and implemented predictive analytics models to forecast customer behavior, demand patterns, and operational performance.
- Applied A/B testing and statistical experimentation to evaluate the effectiveness of product features, digital campaigns, and user engagement strategies.
- Conducted time-series analysis on transactional and usage data to detect trends, seasonal patterns, and anomalies.
- Performed data preprocessing, feature engineering, and model tuning to improve machine learning model accuracy and performance.
- Optimized large-scale data processing workflows using SQL and big data technologies to improve analytics efficiency and scalability.
- Contributed to the development of Generative AI solutions for automated document analysis, knowledge extraction, and intelligent search systems.
- Migrated analytics workloads to cloud-based environments using Microsoft Azure data platforms to support scalable data processing and machine learning pipelines.
- Developed ML-driven analytical tools to improve customer engagement and deliver personalized recommendations based on user behavior and data insights.
- Created detailed documentation and reproducible workflows to ensure transparency, scalability, and maintainability of data science models and pipelines.
- Presented analytical insights and executive-level reports demonstrating the impact of data science initiatives on customer experience, operational efficiency, and business growth.

Environment: Python, R, SQL Server, Azure Data Factory, Pandas, NumPy, Scikit-learn, Time-Series Analysis Tools, IoT Data Streams, Power BI, Tableau

Client: Corsel Technologies - India | Oct 2018 - Feb 2020
Role: Data Scientist
Responsibilities:
- Designed and deployed ML models for financial risk modeling, fraud detection, and customer segmentation, enabling data-driven decisions.
- Built scalable ETL and data pipelines using Spark, Hadoop, and Databricks, improving ingestion speed for structured and unstructured financial datasets.
- Applied NLP and LLMs for analyzing regulatory filings, contracts, and financial reports, automating compliance workflows for clients in regulated industries.
- Developed recommendation systems for retail banking clients, improving customer engagement through personalized product offerings.
- Engineered real-time anomaly detection solutions with Kafka and ML models, helping clients monitor transactions for suspicious activity.
- Partnered with business analysts, engineers, and compliance officers to align ML solutions with regulatory frameworks and business KPIs.
- Implemented cloud-native ML models on AWS and Azure, supporting scalable deployments for multiple clients' use cases.
- Enhanced customer churn models using advanced feature engineering and time-series analysis, improving retention strategies for telecom and financial services clients.
- Designed neural network architectures using Keras, including CNNs, RNNs, and LSTM models, for advanced data modeling and time-series forecasting.
- Conducted A/B testing and hypothesis validation on deployed ML models, ensuring robustness and reliability before full-scale rollouts.
- Applied explainable AI (XAI) for model transparency, enabling clients to meet governance and audit requirements.
- Delivered predictive maintenance solutions for manufacturing clients using sensor data and ML models, improving operational uptime.
- Implemented graph-based ML solutions to detect hidden relationships in customer networks, supporting fraud prevention strategies.
- Built interactive dashboards in Tableau and Power BI for real-time monitoring of KPIs, fraud alerts, and predictive risk scores.
- Researched and applied deep learning techniques such as CNNs and RNNs to expand ML applications across finance and telecom.
- Integrated MLOps practices with CI/CD pipelines, reducing deployment cycles and increasing production readiness of ML systems.
- Collaborated on GenAI-driven knowledge systems for financial document summarization, enhancing efficiency in compliance departments.
- Trained and mentored junior data scientists on best practices in ML pipelines, data governance, and model validation techniques.
- Applied fairness and bias detection frameworks to financial ML models, ensuring ethical and responsible use of AI.
- Developed semantic search applications with vector databases, enabling clients to quickly retrieve critical financial insights.
- Produced executive presentations that demonstrated ROI from AI initiatives, influencing client adoption of advanced analytics solutions.

Environment: Python, R, Spark, Hive, Azure ML, Power BI, SQL Server, Scikit-learn, Pandas, NumPy

Client: Techasoft - India | Jun 2016 - Oct 2018
Role: Software Developer
Responsibilities:
- Participated in the complete SDLC process and used PHP to develop website functionality.
- Designed and developed the UI of the website using HTML, XHTML, AJAX, CSS, and JavaScript.
- Developed frontend and backend modules using Python on the Django web framework.
- Designed and developed a data management system using MySQL.
- Built application logic using Python.
- Provided a GUI using PyQt for end users to create, modify, and view reports based on client data.
- Used AngularJS to build efficient client-side modules for the web application.
- Applied Service-Oriented Architecture (SOA) and related technologies including Web Services, BPEL, WSDL, SOAP, XML, XSD, and XSLT.
- Participated in requirement gathering and worked closely with the architect on design and modeling.
- Developed SQL queries and stored procedures on MySQL.
- Developed a shopping cart for the library application and integrated web services for payment processing (e-commerce).
- Implemented monitoring and established best practices around Elasticsearch.
- Used Git for version control.

Environment: Python 2.7, Django, PHP, HTML, XHTML, AJAX, CSS, JavaScript, AngularJS, PyQt, MySQL, SQL, Stored Procedures, Service-Oriented Architecture (SOA), Web Services, BPEL, WSDL, SOAP 1.1, XML, XSD, XSLT, Elasticsearch, Git

Education and Certification Details:
Master's in Business Analytics - Florida Atlantic University, USA
Bachelor's in Computer Science - Bangalore University, India

Certifications:
- AWS Data Engineer
- AWS Machine Learning Associate
- AWS Machine Learning Specialty
- AWS Solutions Architect
- AWS DevOps Professional
- Azure AZ-900
- GCP Cloud Digital Leader
- Graph Developer Professional and Associate