
Misbah - AWS Python Developer
[email protected]
Location: Alvord, Texas, USA
Relocation: Yes
Visa: GC
SUMMARY:
10+ years of experience as a Senior Full-Stack Python Developer, specializing in designing, developing, testing, and deploying scalable enterprise applications using Python, Django, and Flask.
Experience in architecting microservices-based solutions with containerization using Docker and orchestration with Kubernetes.
Integrated Java-based REST APIs with frontend interfaces and Python-based services, ensuring seamless interoperability.
Utilized Spring MVC and Hibernate for building robust, scalable, and secure web applications.
Strong hands-on experience with cloud computing platforms, including AWS (EC2, S3, Lambda, RDS, DynamoDB, API Gateway) and Azure services.
Developed and maintained RESTful APIs using Django REST Framework and FastAPI, ensuring high performance and security.
Developed high-performance APIs using FastAPI with asynchronous support to ensure low latency and scalability (a minimal sketch follows this summary).
Implemented real-time data processing solutions using Kafka, RabbitMQ, and Redis for event-driven architectures.
Extensive experience in front-end development using React.js, Redux, Angular, TypeScript, and Bootstrap for dynamic and responsive UI development.
Hands-on experience in managing MongoDB and PostgreSQL databases with performance tuning and indexing strategies.
Built and managed secure communication pipelines between edge devices and AWS IoT Core using MQTT and HTTPS.
Automated CI/CD pipelines using Jenkins, GitHub Actions, and AWS CodePipeline for seamless integration and deployment.
Hands-on experience in developing AI/ML model APIs and integrating them into production Python web services.
Expertise in writing efficient ETL pipelines and data transformations using Python, Pandas, PySpark, and Apache Airflow.
Worked with Terraform and CloudFormation for Infrastructure as Code (IaC) to automate cloud resource provisioning.
Developed and maintained data pipelines using Apache Airflow, orchestrating complex ETL workflows for seamless data ingestion and transformation.
Implemented OAuth 2.0, JWT, and other authentication/authorization mechanisms to secure web applications and APIs.
Hands-on experience with serverless computing and deploying functions using AWS Lambda and Azure Functions.
Experience implementing caching using Redis/Memcached to optimize performance of backend services.
Built real-time applications using WebSockets with Django Channels and Node.js-based socket communication.
Experience building GraphQL APIs with Graphene, integrated with PostgreSQL databases.
Experience in Agile development methodologies, including Scrum and Kanban, collaborating with cross-functional teams.
Designed and developed GraphQL APIs using Python (Graphene, Strawberry) for flexible data querying.
Strong background in performance tuning, debugging, and troubleshooting production issues in large-scale applications.
Wrote unit, integration, and end-to-end tests using pytest, unittest, Selenium, and Cypress for quality assurance.
Experience with logging and monitoring tools like ELK Stack, Prometheus, and Grafana for application health tracking.
Skilled in using Git for version control and collaborating through GitHub, GitLab, and Bitbucket.
Excellent problem-solving skills and ability to optimize algorithms for computational efficiency and scalability.
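
A minimal sketch of the FastAPI pattern described above, assuming a hypothetical Item model and an in-memory store in place of a real database:

# Minimal FastAPI service sketch; the Item model and in-memory store are
# hypothetical stand-ins for a real domain model and database.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()  # interactive Swagger UI is auto-generated at /docs

class Item(BaseModel):
    id: int
    name: str

_items: dict = {}  # in-memory store, for illustration only

@app.post("/items", status_code=201)
async def create_item(item: Item) -> Item:
    _items[item.id] = item
    return item

@app.get("/items/{item_id}")
async def read_item(item_id: int) -> Item:
    if item_id not in _items:
        raise HTTPException(status_code=404, detail="Item not found")
    return _items[item_id]

Run with "uvicorn main:app"; the async route handlers let the event loop serve other requests while one is waiting on I/O, which is the low-latency property noted above.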

EDUCATION:
Bachelor of Engineering in Computer Science from Anna University, 2014, India

CERTIFICATION:
AWS Certified Developer Associate

SKILLSET:

Programming Languages: Python, Ruby, Java/J2EE, JavaScript, Bash/Shell
Databases: SQL, MySQL, Oracle, MongoDB, DynamoDB, Cassandra, PostgreSQL, NoSQL
Python Frameworks: Django, Flask, FastAPI, Pyramid
Web Technologies: HTML5, CSS3, XML, Bootstrap 3, AJAX, DOM, Spring Boot, jQuery, Angular
Python Libraries: NumPy, Pandas, Matplotlib, SciPy, TensorFlow
IDE Tools: Visual Studio Code, IntelliJ, Eclipse, PyCharm, Xcode
Container Tools & Technologies: Docker, Kubernetes, REST API, Ansible, Terraform
Build & CI/CD Tools: Ant, Maven, Jenkins, Docker Hub, GitHub Actions
Testing: unittest, Selenium, pytest
Monitoring Tools: Splunk, CloudWatch, CloudTrail, Datadog, Grafana
Cloud: AWS (S3, EC2, Lambda, Step Functions, RDS, Elastic Beanstalk, IAM, Athena, DynamoDB, EMR), Google Cloud Platform, Azure Cloud


WORK EXPERIENCE:

Client: AXA XL Insurance, Stamford, CT Sep 2022 - Present
Role: Sr. Python AWS Developer
Responsibilities:
Configured URL patterns and RESTful API endpoints within Django Rest Framework (DRF) to ensure consistent and predictable access to reinsurance data.
Implemented robust data validation within DRF serializers and during import processes using CSV, XML, and JSON schema validators, guaranteeing data integrity.
Created RESTful web services using FastAPI to send and receive data from PostgreSQL, DynamoDB, and S3 buckets.
Developed and maintained Python-based REST APIs using Flask and Django, backed by PostgreSQL databases for structured data storage.
Designed scalable relational database schemas in PostgreSQL, applying normalization and indexing strategies to improve query performance.
Wrote advanced SQL queries, views, stored procedures, and triggers in PostgreSQL to support complex reporting and data analytics.
Developed RESTful APIs using FastAPI for high-performance backend services.
Supported flexible data import/export (CSV, XML, JSON) and utilized Pandas & NumPy for data cleansing, transformation, and advanced analytics, facilitating efficient data exchange and analysis.
Implemented multithreading in data processing tasks and developed AWS Lambda functions for asynchronous tasks like data processing and notifications, optimizing system performance and scalability.
Configured AWS API Gateway as the entry point for external systems, enabling seamless interaction with reinsurance APIs.
Developed automated Python scripts using Boto3 to upload, download, and manage files in Amazon S3, supporting real-time data pipelines and archival processes.
Integrated S3 with Flask-based web services for dynamic file handling and secure document storage with pre-signed URL access (a boto3 sketch follows these responsibilities).
Built and documented API endpoints using FastAPI's automatic Swagger UI integration.
Implemented lifecycle policies in S3 buckets using Python to manage log archival, reduce storage costs, and ensure data retention compliance.
Utilized AWS S3 for secure and scalable storage of reinsurance documents and DynamoDB for efficient metadata storage, improving retrieval of treaty details.
Employed SQL queries and stored procedures for data retrieval from various sources.
Monitored system performance using AWS CloudWatch, proactively identifying and resolving errors.
Developed and maintained data pipelines using S3 and AWS RDS for ingesting and managing data from various sources, ensuring data integrity and reliability.
Integrated Pytest into the build pipeline for automated testing and code validation, promoting code quality.
Utilized Git and GitHub for version control and collaboration on the Reinsurance Module.
Participated in issue tracking, task assignment, and resolution using Jira and SharePoint, fostering a collaborative development environment.
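
A short boto3 sketch of the S3 upload, pre-signed URL, and lifecycle handling described above; the bucket names, key prefixes, and retention periods are illustrative placeholders, not project values:

# boto3 S3 sketch; all names and retention values below are illustrative.
import boto3

s3 = boto3.client("s3")

def upload_document(path: str, bucket: str, key: str) -> None:
    # Upload a local file to S3.
    s3.upload_file(path, bucket, key)

def presigned_download_url(bucket: str, key: str, expires: int = 3600) -> str:
    # Time-limited pre-signed URL for secure document download.
    return s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": bucket, "Key": key},
        ExpiresIn=expires,
    )

def archive_logs(bucket: str, days_to_glacier: int = 90) -> None:
    # Lifecycle rule: transition logs/ objects to Glacier, then expire them.
    s3.put_bucket_lifecycle_configuration(
        Bucket=bucket,
        LifecycleConfiguration={
            "Rules": [{
                "ID": "archive-logs",
                "Filter": {"Prefix": "logs/"},
                "Status": "Enabled",
                "Transitions": [
                    {"Days": days_to_glacier, "StorageClass": "GLACIER"}
                ],
                "Expiration": {"Days": days_to_glacier + 275},
            }]
        },
    )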

Environment: Python 3.9, Django 4.x, AWS, EC2, EBS, S3, RDS, VPC, Lambda, DynamoDB, PyCharm, HTML, JSON, Flask, MongoDB, Jenkins, Docker, Git, MySQL, Unix, PostgreSQL.

Client: Centene, St. Louis, MO Feb 2020 - Aug 2022
Role: Sr. Python AWS Developer
Responsibilities:
Automated routine tasks and workflows using Bash, Perl, and Python scripting, saving significant manual effort.
Created Business Logic in Python to develop robust Planning and Tracking systems for operational management.
Built Bitwage Admin, Client, and Payment Apps with a ReactJS frontend and an AWS Serverless backend (Python and Node.js), ensuring efficient deployment and scalability.
Designed and managed a comprehensive data management system using MySQL, optimizing database performance and reliability.
Developed robust CI/CD pipelines using Jenkins, Ansible Playbooks, and Ansible Tower, automating build and deployment processes.
Designed, built, and managed an ELK stack (Elasticsearch, Logstash, Kibana) for centralized logging, search functionality, and analytics.
Created real-time dashboards for executives using Logstash, Graphite, Elasticsearch, Kibana, and Redis for actionable insights.
Optimized LangChain prompt chaining and memory management for performance and latency across large-scale data workflows.
Collaborated with ML engineers and data scientists to integrate fine-tuned LLMs into production environments with LangChain wrappers.
Leveraged LangChain agents and tools to create intelligent assistants capable of API calling, search, and memory retrieval.
Researched and implemented cutting-edge advancements in LangChain and LangGraph to enhance the capabilities of AI-powered platforms.
Documented system design, data flow, and component architecture, enabling onboarding and collaboration across cross-functional teams.
Developed and tested interactive features for dashboards with Django, CSS, JavaScript, and Bootstrap, improving user experience.
Leveraged Docker for containerization, including snapshots, container management, and virtualized image deployments using Dockerfiles.
Integrated AWS services like EC2, S3, Auto Scaling, CloudWatch, and SNS to build and manage cloud-native applications.
Developed and tested Python APIs for debugging, analyzing processor array structures at failure points.
Worked with PySpark to develop data transformation programs, create DataFrames, and manage large-scale data on HDFS (a PySpark sketch follows these responsibilities).
Created automation scripts for testing controllers in CI/CD environments using Python, Java, Bash scripts, and Linux command-line tools.
Converted XML to XSLT for seamless third-party application data exchanges and integrated them into business workflows.
Used the Beautiful Soup, NumPy, and pandas libraries for data extraction, manipulation, and analysis.
Managed network devices such as routers, switches, and wireless access points, ensuring high availability and performance.
Implemented code coverage tools like SonarQube and unit test plugins (JUnit, FindBugs, Checkstyle) with Maven and Hudson for quality assurance.
Worked with NoSQL databases and wrote optimized stored procedures for data normalization and denormalization.
Developed custom Salesforce applications, performed data mapping between Salesforce and legacy CRM systems, and set up applications as per organizational needs.
Wrote scripts to automate testing of controllers, debugging, and data processing pipelines using Python.
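
An illustrative PySpark sketch of the data transformation work described above; the HDFS paths, schema, and column names are invented for the example:

# PySpark transformation sketch; paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-example").getOrCreate()

# Read raw CSV data from HDFS into a DataFrame.
raw = spark.read.csv("hdfs:///data/raw/events.csv", header=True, inferSchema=True)

# Cleanse: drop rows missing the key column and normalize the amount type.
cleaned = (
    raw.dropna(subset=["event_id"])
       .withColumn("amount", F.col("amount").cast("double"))
)

# Aggregate and write the curated result back to HDFS as Parquet.
summary = cleaned.groupBy("region").agg(F.sum("amount").alias("total_amount"))
summary.write.mode("overwrite").parquet("hdfs:///data/curated/amount_by_region")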

Environment: Python 3.x, Flask, AWS, Pyramid, Redis, Django, Docker, REST, GitHub, Linux, NumPy, Node.js, AJAX, ReactJS, Angular 2, Azure DevOps.

Client: Global Pay, Atlanta, GA Dec 2016 - Jan 2020
Role: Python Developer
Responsibilities:
Worked with cloud engineers to analyze and design all aspects of AWS environments and topologies; assessed automation opportunities from an architectural perspective and prioritized proposed solutions.
Set up and built various AWS infrastructure resources (VPC, EC2, S3, IAM, EBS, Security Groups, Auto Scaling, and RDS) using CloudFormation JSON templates.
Built database models, views, and APIs using Python for interactive web-based solutions.
Involved in web-services backend development using Python (CherryPy, Django, SQLAlchemy).
Used the Pandas library for statistical analysis. Worked on the Python OpenStack API.
Used PyUnit, the Python unit test framework, for all Python applications.
Wrote Python modules to view and connect to the Apache Cassandra instance.
Set up and managed Windows servers on the AWS platform using IAM, Security Groups, EC2, EBS, and RDS.
Set up databases on Amazon RDS and EC2 instances as per requirements.
Created and managed S3 buckets to store database logs and backups.
Configured and maintained Amazon Virtual Private Cloud (VPC) resources (e.g., subnets, security groups, permission policies), made connections between different zones, and blocked suspicious IPs/subnets via ACLs.
Implemented AWS security, privacy, performance, and monitoring solutions, including automated responses.
Imported data from DynamoDB to Redshift in batches using AWS Batch with the TWS scheduler, and built a CI/CD pipeline using Jenkins.
Built application and database servers using AWS EC2 and RDS for Oracle DB.
Used the AWS Command Line Interface (CLI) to automate backups to S3 buckets and file transfers to and from Amazon S3, and created nightly AMIs of production servers as backups.
Created alarms, monitors, and notifications for EC2 hosts in CloudWatch (a boto3 sketch follows these responsibilities).
Created S3 buckets in the AWS environment to store files, some of which served static content for a web application.
Developed the required XML Schema documents and implemented the framework for parsing XML documents.
Extensively worked on Jenkins for continuous integration and end-to-end automation of all builds and deployments.
Developed responsive UI using HTML5/CSS3, AngularJS and JavaScript.
Used JSON-based and RESTful APIs for information extraction.
Involved in debugging, troubleshooting, and fixing bugs in the SWIM application, the main source of data for internal and external customers and service teams.
Gained in-depth knowledge of the configuration management tool Chef.
Exported/Imported data between different data sources using SQL Server Management Studio and Oracle.
Utilized Agile, DevOps and Lean best practices to efficiently and modularly develop, deploy, and operate the target solutions.
Wrote complex SQL queries for data validation based on ETL mapping specifications, using Informatica for the Oracle database.
Worked in a fast-paced Agile environment with two-week sprints; attended sprint planning at the beginning of each sprint, a retrospective at the end, and a mid-sprint review/product backlog review (PBR) meeting to go over the backlog, prioritize user stories, and estimate points.
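
A boto3 sketch of the CloudWatch alarm setup described above; the instance ID and SNS topic ARN are placeholders:

# CloudWatch alarm sketch; the instance ID and SNS topic ARN are placeholders.
import boto3

cloudwatch = boto3.client("cloudwatch")

# Alarm when average CPU on an EC2 host exceeds 80% for two consecutive
# 5-minute periods, notifying an SNS topic for the on-call team.
cloudwatch.put_metric_alarm(
    AlarmName="ec2-high-cpu",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,
    EvaluationPeriods=2,
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:ops-alerts"],
)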

Environment: Python 2.x, Git, SVN, GitHub, Lambda, DynamoDB, Redshift, EC2, IAM, S3, CloudWatch, Django 1.5, MySQL, AngularJS, Pandas, Flask, PyUnit, OpenStack, HTML, CSS, jQuery, Jenkins, Nexus, JavaScript, Apache, Jira, Linux, Cassandra, Windows, SSMS, Informatica, Oracle.

Client: Gap India Aug 2014 - Nov 2016
Role: Python Developer
Responsibilities:
Developed a Python-based microservice to extract data from systems of record into the enterprise data warehouse.
Managed large-scale, geographically distributed database systems, including relational (Oracle, SQL Server) and NoSQL (MongoDB, Cassandra) systems.
Optimized Hive queries using best practices and the right parameters, leveraging technologies like Python and PySpark.
Built, managed, and continuously improved the build infrastructure for global software development engineering teams, including implementation of build scripts, continuous integration infrastructure, and deployment tools.
Troubleshot production issues pertaining to AWS cloud resources and application infrastructure.
Built numerous Lambda functions using Python and automated processing through the events they were triggered on.
Created an AWS Lambda architecture to monitor AWS S3 buckets with triggers for processing source data (a handler sketch follows these responsibilities).
Migrated data from SQL databases (Oracle and SQL Server) to NoSQL databases (MongoDB).
Involved in setting up microservices using API Gateway, Lambda, and DynamoDB that connect to the UI.
Wrote infrastructure as code in Terraform, Azure Resource Manager, and AWS CloudFormation; created reusable Terraform modules in AWS cloud environments.
Architected, built, and maintained highly available, secure, multi-zone AWS cloud infrastructure utilizing Chef with AWS CloudFormation and Jenkins for continuous integration.
Installed, configured, and automated Jenkins build jobs for continuous integration and AWS deployment pipelines using plug-ins such as the Jenkins EC2 and CloudFormation plug-ins.
Set up and implemented a continuous integration and continuous delivery (CI/CD) process stack using AWS, GitHub/Git, and Jenkins.
Worked on tools such as Kubernetes with Docker to assist with auto-scaling, continuous integration, and rolling updates with no downtime.
Maintained a farm of AWS resources including EC2 instances, ELBs, S3, EBS, Auto Scaling, and RDS; set up servers on AWS for deployment and other application uses.
Designed roles and groups for users and resources using AWS Identity and Access Management (IAM) and managed network security using security groups and IAM.
Used version control systems like Git and SVN.
Built and maintained Docker container clusters managed by Kubernetes on GCP using Linux, Bash, Git, and Docker; utilized Kubernetes and Docker as the runtime environment for the CI/CD system to build, test, and deploy.
Loaded data into Spark RDDs and performed in-memory computation to generate output responses.
Configured and maintained Jenkins to implement the CI/CD process and integrated the tool with Ant and Maven to schedule the builds.
Developed templates for AWS infrastructure as code using Terraform to build staging and production environments.
Involved in infrastructure as code, execution plans, resource graph and change automation using Terraform.
Managed AWS infrastructure as code using Terraform.
Managed Amazon Web Services (AWS) infrastructure with automation and configuration management tools such as Chef; designed cloud-hosted solutions drawing on specific AWS product suite experience.
Created S3 buckets, managed their policies, and utilized S3 and Glacier for storage and backup on AWS.
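
A sketch of an S3-triggered Lambda handler of the kind described above; the processing step is a placeholder for the real source-data logic:

# S3-triggered Lambda handler sketch; the processing step is a placeholder.
import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    # Invoked by S3 ObjectCreated events configured on the bucket.
    # Note: object keys in real events arrive URL-encoded and may need
    # urllib.parse.unquote_plus before use.
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        obj = s3.get_object(Bucket=bucket, Key=key)
        body = obj["Body"].read()
        # Placeholder: process the source data here.
        print(f"Received {key} from {bucket}: {len(body)} bytes")
    return {"status": "ok"}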

Environment: Python, AWS EC2, S3, ELB, SVN, ClearCase, Maven, Ant, Gradle, Jenkins, Git, Chef, Kubernetes, WebSphere, Jira, SDLC, Docker, Nagios, Shell scripts, Unix/Linux environment, Spark, Spark API, Spark SQL.