| Venkata - Oracle PL/SQL Developer/ETL Developer |
| [email protected] |
| Location: Warren, Illinois, USA |
| Relocation: Yes |
| Visa: H1B |
| Resume file: Venkata_Sathya_Oracle_PL SQL_ETL Developer_1775762878002.docx |
Venkata Sathya Ramesh Kumar
Mail: [email protected]    Phone: +1 (469) 699 8203

PROFESSIONAL SUMMARY

- Experienced Data Engineer with 18+ years of hands-on expertise in designing, developing, and maintaining large-scale, high-performance data systems and database applications.
- Strong proficiency in Oracle databases, Advanced PL/SQL, and Unix/Linux shell scripting.
- Deep understanding of logical and physical data modeling, database design principles, and best practices to ensure data integrity, scalability, performance, and security.
- Extensive experience working with relational databases including Oracle (19c/12c/11g), SQL Server (2008/2016/2018), and PostgreSQL (12.x/14.x) in client-server environments.
- Advanced expertise in PL/SQL development, including cursors, exception handling, collections, MERGE statements, views, materialized views, table functions, stored procedures, packages, triggers, object-oriented programming, dynamic SQL, external tables, and analytical functions.
- Strong hands-on experience with advanced PL/SQL features such as BULK COLLECT, FORALL, SAVE EXCEPTIONS, pipelined functions, and SYS_REFCURSOR for efficient data processing (a minimal sketch follows this summary).
- Developed and implemented robust database solutions using Oracle-supplied packages including DBMS_SQL, DBMS_SCHEDULER, UTL_FILE, DBMS_XMLDOM, DBMS_UTILITY, and DBMS_LOCK.
- Designed and optimized complex PL/pgSQL functions, procedures, and queries using indexing, partitioning, CTEs, and EXPLAIN ANALYZE to support high-volume data workloads.
- Built scalable data pipelines integrating PostgreSQL, Oracle, AWS RDS, and ETL tools, improving query performance by up to 60% through optimization, parallel execution, and schema tuning.
- Expertise in performance tuning using query optimization, indexing strategies, SQL profiling, database partitioning, and tools such as EXPLAIN PLAN, SQL Trace, and SQL Tuning Advisor (a tuning sketch follows the technical skills list).
- Implemented Oracle partitioning strategies (LIST, RANGE, HASH) to enhance data loading efficiency and query performance.
- Skilled in identifying production bottlenecks, troubleshooting critical issues, and deploying hotfixes to ensure system stability.
- Strong experience in Unix/Linux environments, including Bash, PowerShell scripting, and scheduling jobs using cron.
- Designed and implemented ETL pipelines using shell scripting to ingest data from flat files into staging and data warehouse tables.
- Developed automated workflows by integrating shell scripts with PL/SQL procedures for data validation and loading into fact tables.
- Experience building and managing CI/CD pipelines using Jenkins and other DevOps tools to automate deployment and improve development efficiency.
- Proficient in testing methodologies including unit testing, integration testing, performance testing, regression testing, and providing UAT support with test automation.
- Skilled in designing system documentation and diagrams such as Data Flow Diagrams (DFD), Use Case Diagrams, ERDs, and process flow diagrams.
- Developed reporting and data visualization solutions using SSRS, Power BI, and Splash BI.
- Hands-on experience with SQL*Loader and external tables for efficient ingestion of large datasets from flat files.
- Strong background in data analysis, data profiling, and documentation including source-to-target mappings, BRDs, and FRDs.
- Analyzed large-scale datasets using optimized SQL queries to identify patterns, performance issues, and data quality gaps.
- Extensive experience in ETL development using Informatica PowerCenter, Oracle Data Integrator (ODI), SSIS, and legacy data warehouse systems.
- Implemented data integration solutions using SSIS to automate workflows and support enterprise data warehousing.
- Experience in database migration projects, including migrating SQL Server databases to Oracle.
- Proficient with version control systems such as GitHub and Azure DevOps; experienced with workflow orchestration tools like Control-M and Apache Airflow.
- Hands-on experience with AWS cloud services, including EC2 and RDS, for deploying and managing data infrastructure.
- Developed data extraction and processing solutions using Python; also experienced in application development using Core Java and J2EE.
- Familiar with Agile methodologies (Kanban) and tools like JIRA for project tracking and collaboration.
- Actively contributed across the software development lifecycle, including maintenance, release management, and critical bug fixes to ensure reliable delivery.
- Leveraged tools like GitHub Copilot to accelerate development in PL/SQL, Python, and ETL processes, improving productivity and code quality.
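Illustrative sketch only, not code taken from any engagement below: a minimal example of the BULK COLLECT / FORALL / SAVE EXCEPTIONS pattern named in the summary. The STG_ORDERS and FACT_ORDERS tables are hypothetical.

    DECLARE
      CURSOR c_src IS SELECT order_id, amount FROM stg_orders;  -- hypothetical staging table
      TYPE t_rows IS TABLE OF c_src%ROWTYPE;
      l_rows      t_rows;
      bulk_errors EXCEPTION;
      PRAGMA EXCEPTION_INIT(bulk_errors, -24381);  -- ORA-24381: array DML errors
    BEGIN
      OPEN c_src;
      LOOP
        FETCH c_src BULK COLLECT INTO l_rows LIMIT 1000;  -- batch fetch cuts context switches
        EXIT WHEN l_rows.COUNT = 0;
        BEGIN
          FORALL i IN 1 .. l_rows.COUNT SAVE EXCEPTIONS   -- keep loading past bad rows
            INSERT INTO fact_orders (order_id, amount)
            VALUES (l_rows(i).order_id, l_rows(i).amount);
        EXCEPTION
          WHEN bulk_errors THEN
            FOR j IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP  -- report each rejected row
              DBMS_OUTPUT.PUT_LINE('Row ' || SQL%BULK_EXCEPTIONS(j).ERROR_INDEX ||
                                   ' failed with ORA-' || SQL%BULK_EXCEPTIONS(j).ERROR_CODE);
            END LOOP;
        END;
      END LOOP;
      CLOSE c_src;
      COMMIT;
    END;
    /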
EDUCATIONAL DETAILS

B.V.Sc & A.H, Acharya N.G. Ranga Agricultural University, 2002

TECHNICAL SKILLS (years of experience in parentheses)

- Oracle 19c Exadata/12c/11g/10g/9i/8i/8: SQL, PL/SQL, SQL*Plus, Performance Tuning and Optimization
- Oracle utilities (19c/12c/11g/10g/9i): SQL*Loader (SQLLDR), SQL Developer, TOAD (18)
- PostgreSQL (12.x/14.x), Performance Tuning and Optimization (4)
- AWS RDS, S3, Git, GitHub, DevOps pipelines, Jenkins, CI/CD (4)
- ETL tools: Informatica 9.x/10.x, ODI (10)
- UNIX shell scripting, AIX cron scheduling (4)
- Data modeling using Erwin 8.x/9.x/10.x (6)
- SQL Server 2005/2008/2016/2018/2019 (6)
- Reporting tools: Power BI, Splash BI (6)
- REST API, JSON, XML, microservices, SSIS packages (4)
- VSS, DevOps, Git (version control tools) (10)
- Database testing, unit testing, integration testing, user acceptance testing (6)
- Oracle Forms 6i/9i, Oracle Reports 6i/9i (6)
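Illustrative sketch only: how EXPLAIN PLAN output is typically inspected with DBMS_XPLAN in the tuning work listed above. The SALES table and SALES_CUST_IX index are hypothetical.

    EXPLAIN PLAN FOR
      SELECT /*+ INDEX(s sales_cust_ix) */ customer_id, SUM(amount)
      FROM   sales s
      WHERE  sale_date >= DATE '2024-01-01'
      GROUP  BY customer_id;

    -- Render the plan just captured in PLAN_TABLE
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);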
PROFESSIONAL EXPERIENCE

LPL Finance, Chicago, IL (Oct 2025 – Present)
Lead Data Engineer/ETL

Responsibilities:
- Engineered and optimized high-performance PL/SQL packages, functions, and materialized views on Oracle 19c Exadata, leveraging partitioning, parallel execution, and AWR/ASH diagnostics to improve throughput and significantly reduce ETL runtimes.
- Documented enterprise architecture by creating business process flows, data lineage diagrams, and application architecture documents using Microsoft Visio, supporting design reviews, audits, and compliance initiatives.
- Developed advanced PL/SQL frameworks utilizing BULK COLLECT, FORALL, pipelined functions, dynamic SQL, and autonomous transactions to efficiently process high-volume data transformations (a pipelined-function sketch follows this section).
- Designed and secured REST and SOAP-based Oracle APIs to enable real-time data exchange across internal applications, third-party systems, and enterprise reporting platforms.
- Performed advanced SQL and PL/SQL performance tuning using AWR/ASH reports, TKPROF, SQL Profiles, indexing strategies, partition pruning, and optimizer hints, successfully reducing query execution time and improving overall system performance.
- Built scalable ETL workflows in Informatica PowerCenter 10.x using reusable mappings, mapplets, and optimized transformations, improving data load efficiency and maintainability.
- Utilized Python (Pandas, Requests, and custom parsing scripts) to extract, cleanse, and standardize data from APIs, flat files, and semi-structured sources for downstream ETL processing.
- Leveraged GitHub Copilot in enterprise environments to accelerate PL/SQL, Python, and ETL development, improving code quality and reducing development cycles.
- Automated enterprise batch workflows using Autosys, implementing job scheduling, dependency management, SLA monitoring, alerting, and failure recovery mechanisms.
- Designed enterprise data models and optimized SQL queries to support analytics and BI dashboards delivered through Power BI.
- Executed large-scale data migration and integration across Oracle databases, flat files, and third-party systems with robust error handling and reconciliation frameworks.
- Improved reporting performance by optimizing SQL queries, implementing partition strategies, and maintaining materialized views for high-performance analytical workloads.
- Ensured ETL accuracy and data integrity through validation rules, reconciliation processes, and comprehensive audit logging mechanisms.
- Configured and managed AWS RDS (Oracle), including database migration, performance tuning, backup/recovery, monitoring, and secure connectivity.
- Deployed and optimized AWS EC2 environments to run Python-based ETL jobs, automation scripts, and batch workloads in a scalable manner.
- Integrated AWS S3 for data ingestion, storage, and distribution, enabling seamless interaction with ETL pipelines and downstream analytics systems.

Environment: Oracle 19c Exadata, AWS RDS, EC2, S3, Informatica 10.x, Python, Toad 16, GitHub, Jira, AutoSys, Visio, Unix, Windows, XML, Excel, Visual Studio 15/17.
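Illustrative sketch only: a pipelined table function of the kind named above, streaming cleansed rows into an INSERT ... SELECT. All object names are hypothetical.

    CREATE OR REPLACE TYPE t_order_row AS OBJECT (order_id NUMBER, amount NUMBER);
    /
    CREATE OR REPLACE TYPE t_order_tab AS TABLE OF t_order_row;
    /
    CREATE OR REPLACE FUNCTION clean_orders RETURN t_order_tab PIPELINED IS
    BEGIN
      FOR r IN (SELECT order_id, amount FROM stg_orders WHERE amount IS NOT NULL) LOOP
        PIPE ROW (t_order_row(r.order_id, ROUND(r.amount, 2)));  -- rows stream to the caller as produced
      END LOOP;
      RETURN;
    END;
    /
    -- Consumed like a table, so large result sets never sit in session memory:
    INSERT INTO fact_orders (order_id, amount)
    SELECT order_id, amount FROM TABLE(clean_orders());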
Bank of America, Chicago, IL (June 2024 – Sept 2025)
Lead Data Engineer/ETL

Responsibilities:
- Built scalable ETL workflows in Informatica PowerCenter 10.x using reusable mappings, mapplets, and optimized transformations to enhance data load efficiency and maintainability.
- Engineered and optimized high-performance PL/SQL packages, functions, and materialized views on Oracle 19c Exadata, leveraging partitioning, parallel execution, and AWR/ASH diagnostics to improve throughput and significantly reduce ETL runtimes.
- Performed advanced SQL and PL/SQL performance tuning using AWR/ASH reports, TKPROF, SQL Profiles, indexing strategies, partition pruning, and optimizer hints, resulting in reduced query execution times and improved overall system performance.
- Developed advanced PL/SQL frameworks utilizing BULK COLLECT, FORALL, pipelined functions, dynamic SQL, and autonomous transactions to efficiently process high-volume data transformations.
- Designed enterprise data models and optimized SQL queries to support analytics and BI dashboards delivered through Power BI.
- Improved reporting performance by optimizing SQL queries, implementing partition strategies, and maintaining materialized views for high-performance analytical workloads.
- Executed large-scale data migration and integration across Oracle databases, flat files, and third-party systems with robust error handling and reconciliation frameworks.
- Ensured ETL accuracy and data integrity through validation rules, reconciliation processes, and comprehensive audit logging mechanisms.
- Designed and secured REST and SOAP-based Oracle APIs to enable real-time data exchange across internal applications, third-party systems, and enterprise reporting platforms.
- Utilized Python (Pandas, Requests, and custom parsing scripts) to extract, cleanse, and standardize data from APIs, flat files, and semi-structured sources for downstream ETL processing.
- Automated enterprise batch workflows using Autosys, implementing job scheduling, dependency management, SLA monitoring, alerting, and failure recovery mechanisms.
- Leveraged GitHub Copilot to accelerate PL/SQL, Python, and ETL development, improving code quality and reducing development cycles.
- Configured and managed AWS RDS (Oracle), including database migration, performance tuning, backup and recovery, monitoring, and secure connectivity.
- Documented enterprise architecture by developing business process flows, data lineage diagrams, and application architecture documentation using Microsoft Visio, supporting design reviews, audits, and compliance initiatives.

Environment: Oracle 19c Exadata, AWS RDS, EC2, S3, Informatica 10.x, Python, Toad 16, GitHub, Jira, AutoSys, Visio, Unix, Windows, XML, Excel, Visual Studio 15.

AbbVie Inc., Chicago, IL (March 2023 – May 2024)
Lead Data Engineer

Responsibilities:
- Designed, developed, and maintained PL/SQL code and database objects such as tables, views, stored procedures, functions, triggers, and packages.
- Identified and resolved performance issues by analyzing and optimizing SQL queries, tuning database parameters, and optimizing PL/SQL code for efficient execution.
- Collaborated with database administrators and stakeholders to design efficient and scalable database structures, including data modeling, normalization, denormalization, and partitioning strategies.
- Built and optimized PL/pgSQL functions, procedures, triggers, and SQL queries using indexing, partitioning, CTEs, and EXPLAIN ANALYZE for high-volume data processing.
- Wrote queries against PostgreSQL 14 and implemented PL/pgSQL procedures.
- Partitioned fact tables and materialized views to enhance performance.
- Extensively used bulk collection in PL/SQL objects to improve performance.
- Created records, tables, and collections (nested tables and arrays) to improve query performance by reducing context switching.
- Used PRAGMA AUTONOMOUS_TRANSACTION to avoid mutating-table problems in database triggers.
- Handled errors extensively using exception handling for ease of debugging and for displaying error messages in the application.
- Supported the data warehouse by developing packaged procedures to load data from staging tables to fact tables with validations.
- Created and managed JSON data structures within Oracle databases, including tables, columns, and indexes (a JSON sketch follows this section).
- Developed PL/SQL procedures, functions, and packages to handle JSON data processing, validation, and transformation.
- Wrote efficient SQL queries and optimized performance for JSON data retrieval and manipulation.
- Provided production support by troubleshooting and resolving database-related issues, addressing user queries, and ensuring database availability and performance.
- Wrote Python parsers to extract useful data from the design database.
- Used Airflow to schedule and monitor workflows.
- Developed code fixes and enhancements for inclusion in future code releases and patches.

Environment: Oracle 19c/12c, PostgreSQL 14, Informatica 10.x, Oracle LSH, Python, Toad 9/11, Git, DevOps, Jira, Airflow, Windows XP, XML, Excel, Visual Studio 15/17, Unix, AIX, Visio.
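Illustrative sketch only: storing and projecting JSON in Oracle (12c and later), the techniques described above. Table and column names are hypothetical.

    -- IS JSON constraint validates documents on insert
    CREATE TABLE patient_events (
      id  NUMBER GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
      doc CLOB CONSTRAINT evt_is_json CHECK (doc IS JSON)
    );

    -- JSON_TABLE projects the payload relationally for downstream ETL
    SELECT jt.event_type, jt.event_ts
    FROM   patient_events p,
           JSON_TABLE(p.doc, '$'
             COLUMNS (event_type VARCHAR2(30) PATH '$.type',
                      event_ts   VARCHAR2(30) PATH '$.timestamp')) jt;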
Baker-Taylor (Feb 2021 – Feb 2023)
Senior Data Engineer

Responsibilities:
- Designed, developed, and maintained PL/SQL code and database objects such as tables, views, stored procedures, functions, triggers, and packages.
- Built and optimized PL/pgSQL functions, procedures, and SQL queries in PostgreSQL 12 using indexing, partitioning, and performance analysis tools.
- Collaborated with database administrators and stakeholders to design efficient and scalable database structures, including data modeling, normalization, denormalization, and partitioning strategies.
- Attended to CDMS and EDW production support and fixed bugs by applying hotfixes.
- Monitored ETL feeds and fixed production ETL feed failures by identifying problems through logs, source directories, and database space checks.
- Identified and resolved performance issues by analyzing and optimizing SQL queries, tuning database parameters, and optimizing PL/SQL code for efficient execution.
- Performed SQL, PL/SQL, and application tuning using tools such as EXPLAIN PLAN, SQL*Trace, TKPROF, and AUTOTRACE.
- Worked with SQL*Loader and external tables to load data from flat files received daily from various facilities (an external-table sketch follows this section).
- Developed and implemented reporting, analytics, and data visualization solutions with Power BI.
- Partitioned fact tables and materialized views to enhance performance.
- Extensively used bulk collection in PL/SQL objects to improve performance.
- Handled errors extensively using exception handling for ease of debugging and for displaying error messages in the application.
- Supported the data warehouse by developing packaged procedures to load data from staging tables to fact tables with validations.
- Developed, implemented, and maintained SSIS/DTS packages to extract data from SQL Server.
- Used version control applications such as GitHub and DevOps.
- Created interactive and visually appealing reports and dashboards in SSRS; orchestrated supporting workflows with Apache Airflow.
- Created and managed JSON data structures within Oracle databases, including tables, columns, and indexes.
- Implemented JSON parsing and serialization/deserialization techniques in Oracle applications.
- Wrote Python parsers to extract useful data from the design database.
- Used Airflow to schedule and monitor workflows.
- Developed code fixes and enhancements for inclusion in future code releases and patches.

Environment: Oracle 12c, PostgreSQL 12, Informatica 10.x, AWS, SQL Server (2005/2008/2016), SSIS, SSRS, JSON, REST API, Toad 9/11, Git, DevOps, Jira, Airflow, Windows XP, XML, Excel, Visual Studio 15/17, Unix, AIX, Visio.
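Illustrative sketch only: an ORACLE_LOADER external table for daily flat-file feeds, the technique named above. Directory, file, and table names are hypothetical, and the DIRECTORY object is assumed to already exist.

    CREATE TABLE ext_daily_feed (
      facility_id NUMBER,
      item_code   VARCHAR2(20),
      qty         NUMBER
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY etl_in_dir          -- assumed DIRECTORY object
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('daily_feed.csv')
    )
    REJECT LIMIT UNLIMITED;

    -- Queryable like any table, e.g. for a staging load:
    INSERT INTO stg_daily_feed SELECT * FROM ext_daily_feed;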
Burlington Northern and Santa Fe Railway (BNSF), Dallas, TX (Dec 2018 – Jan 2021)
Senior Data Engineer/Oracle Developer

Responsibilities:
- Involved in the full Agile development cycle of planning, analysis, design, development, testing, and implementation.
- Attended to all types of production issues and solved problems by applying hotfixes.
- Designed logical and physical data models for star and snowflake schemas using Erwin.
- Wrote sequences for automatic generation of unique keys to support primary and foreign key constraints in data conversions (a sequence sketch follows this section).
- Created and modified SQL*Plus, PL/SQL, and SQL*Loader scripts for data conversions.
- Upgraded Oracle 9i to 10g in different environments for the latest features and tested the databases.
- Wrote Python parsers to extract useful data from the design database.
- Developed and implemented reporting, analytics, and data visualization solutions with Splash BI.
- Prepared data models and design specifications and analyzed dependencies.
- Created indexes to improve performance by eliminating full table scans, and created views to hide base tables and reduce the complexity of large queries.
- Fine-tuned procedures and SQL queries for maximum efficiency across databases using Oracle hints for rule-based optimization.
- Migrated objects from SQL Server databases to Oracle.
- Wrote queries and stored procedures to meet company reporting and analytical needs.
- Used version control applications such as GitHub and DevOps.
- Created interactive and visually appealing reports and dashboards in SSRS and Power BI Desktop.
- Performed unit, integration, performance, and regression testing.
- Developed SSIS packages and AIX shell scripts to load data into the data warehouse from different sources.
- Automated the ETL pipeline process by scheduling cron jobs.
- Created custom reports and forms for end users to check details and errors.
- Worked closely with clients to establish problem specifications and system designs.

Environment: Oracle 9i/10g, Informatica 9.x/10.x, SSIS, SSRS, Visual Studio 15/17, JSON, Python, GitHub, Airflow, REST API, SQL Server 2008/2016, Erwin 7.5.8, Windows 7, XML, Excel, Visio.
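Illustrative sketch only: sequence-driven surrogate keys as described above, in the trigger style that matches the Oracle 9i/10g environment listed. Object names are hypothetical.

    CREATE SEQUENCE dim_customer_seq START WITH 1 INCREMENT BY 1 CACHE 100;

    CREATE OR REPLACE TRIGGER dim_customer_bir
    BEFORE INSERT ON dim_customer
    FOR EACH ROW
    WHEN (NEW.customer_key IS NULL)
    BEGIN
      -- assign the surrogate key used by primary/foreign key constraints
      SELECT dim_customer_seq.NEXTVAL INTO :NEW.customer_key FROM dual;
    END;
    /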
Hospital Information System, Ministry of Health, Oman (March 2012 – Nov 2018)
Senior Data Engineer/Oracle Developer

Responsibilities:
- Designed, developed, and maintained Oracle databases using PL/SQL, creating tables, views, indexes, and other database objects to store and manage hospital-related data.
- Performed data transformation, conversion, interfacing, data loading, database modeling, and performance tuning.
- Designed, developed, and maintained ETL (Extract, Transform, Load) processes using IBM InfoSphere DataStage 9.x.
- Analyzed source system data structures, mapped data elements to target systems, and ensured data consistency and integrity across multiple systems.
- Extracted data from different source systems (databases, flat files, and APIs) using InfoSphere DataStage connectors and stages, performing transformations including data cleansing, aggregation, formatting, and implementation of business rules.
- Created data integration workflows, transforming data from various sources and loading it into targets using IBM InfoSphere DataStage 9.x.
- Optimized DataStage job performance by identifying bottlenecks, fine-tuning data integration processes, optimizing SQL queries, and applying parallel processing, partitioning, and caching techniques.
- Wrote complex queries using PostgreSQL 9.x and implemented PL/pgSQL procedures.
- Integrated different applications with XML and web services.
- Developed complex reports to meet user requirements using Oracle Reports 6i/9i.
- Created tables, views, materialized views, and global temporary tables.
- Coded procedures and packages in PL/SQL Developer.
- Migrated objects from SQL Server databases to Oracle.
- Developed and maintained software components of the HRP platform, writing code, designing and implementing new features, and enhancing existing functionality.

Environment: Oracle 11g, PL/SQL, IBM InfoSphere DataStage 9.x, SSIS, Visual Studio 13, Control-M, Forms 9i, Reports 9i, Windows XP, Windows 10, MS Visio.

Inter-tech, Muscat, Oman (April 2009 – Feb 2012)
Oracle Database Developer

Responsibilities:
- Designed, developed, and maintained Oracle databases using PL/SQL, creating tables, views, indexes, and other database objects to store and manage hospital-related data.
- Performed code reviews with analysts and application architects.
- Developed applications using Forms 6i/9i and Reports 6i/9i on Oracle 11g.
- Developed unit and system test plans.
- Performed data transformation, conversion, interfacing, data loading, database modeling, and performance tuning.
- Developed complex reports to meet user requirements using Oracle Reports 6i/9i.
- Created tables, views, materialized views, and global temporary tables.
- Coded procedures and packages in PL/SQL Developer, integrating procedures with cursors.
- Created triggers per client specifications.
- Created functions to implement business logic.
- Involved in module integration, unit testing, and system testing.

Environment: Oracle 11g, PL/SQL, IBM InfoSphere DataStage 9.x, SSIS, Visual Studio 13, Control-M, Forms 9i, Reports 9i, Windows XP, Windows 10, MS Visio.

Fore C Soft, Chennai, India (Nov 2006 – March 2009)
Oracle Database Developer

Responsibilities:
- Designed, developed, and maintained Oracle databases using PL/SQL, creating tables, views, indexes, and other database objects to store and manage hospital-related data.
- Performed code reviews with analysts and application architects.
- Developed applications using Forms 6i/9i and Reports 6i/9i on Oracle 11g.
- Performed data transformation, conversion, interfacing, data loading, database modeling, and performance tuning.
- Developed complex reports to meet user requirements using Oracle Reports 6i/9i.
- Created tables, views, materialized views, and global temporary tables.
- Coded procedures and packages in PL/SQL Developer, integrating procedures with cursors.
- Created functions to implement business logic.
- Involved in module integration, unit testing, and system testing.

Environment: Oracle 11g, PL/SQL, Forms 6i/9i, Reports 6i/9i, Windows XP, Windows 10, MS Visio.