USHASREE GANDLA
Senior Data Engineer | Data Analyst
[email protected] | 469-733-9971 | Allen, TX

PROFESSIONAL SUMMARY

Results-driven Senior Data Engineer with 10+ years of experience building scalable data pipelines, engineering cloud-native data platforms, and delivering analytics solutions across banking, insurance, and healthcare domains. Deep expertise in Snowflake, AWS (S3, Airflow), Databricks, and Python for end-to-end ETL/ELT pipeline development, data modeling, and workflow orchestration. Proven track record at Capital One, Copart, and City National Bank designing ingestion frameworks, dimensional data models, and Tableau-driven BI layers. Skilled at translating complex business requirements into well-defined datasets, enforcing data quality standards, and collaborating cross-functionally with Product, Engineering, Data Science, and Operations teams.
TECHNICAL SKILLS

Data Engineering Apache Airflow, Spark SQL, ETL/ELT Pipelines, Batch & Near-Real-Time Ingestion
Cloud Platforms AWS (S3, Glue, Lambda), GCP (BigQuery, Dataflow, App Engine), Snowflake, Databricks
Databases & SQL Snowflake, Oracle, MS SQL Server, BigQuery, Cassandra, MySQL, PL/SQL, T-SQL
Languages Python (Pandas, NumPy, SQLAlchemy), SQL, PL/SQL, Unix Shell Script
BI & Visualization Tableau Desktop/Server (v6.x–10.5), LOD Calculations, KPIs, Scorecards, Dashboards
Data Modeling Dimensional Modeling, Star/Snowflake Schema, Fact & Dimension Tables, Data Contracts
Orchestration Apache Airflow (DAGs, retries, backfills, idempotency), EMP Jobs, Autosys, Control-M
ETL Tools SSIS, Informatica PowerCenter, SnowSQL, Pentaho
Data Governance Data Quality Rules, Profiling, Root-Cause Analysis, Data Classification, MDM
DevOps & Collab GitHub, Jupyter Notebook, JIRA, Confluence, Agile/Waterfall, CI/CD
EDUCATION
Master's Degree | Kakatiya University, Warangal, India | 2010
PROFESSIONAL EXPERIENCE

Capital One | Plano, TX May 2025 – Present
Sr. Data Engineer / Dev Engineer – DFS & COF Integration
Architected and implemented end-to-end automated data pipelines in Databricks and Snowflake to reconcile Trust Accounts, Bank Accounts, and Credit Card Customers across Discover Financial Services (DFS) and Capital One (COF) platforms post-acquisition.
Wrote complex Spark SQL queries in Databricks to identify and de-duplicate overlapping customer and account records at scale, enabling accurate migration reconciliation across a 100M+ customer base.
Developed Python ingestion components to extract data from Snowflake, stage it to Amazon S3, and trigger automated secure file transfers via internal TADA APIs, achieving zero-manual-intervention pipeline execution.
Designed and configured EMP Job orchestration with automated scheduling, error handling, and retry logic for data extraction, transformation, and transfer workflows; ensured idempotent pipeline runs.
Developed detailed source-to-target data mapping documents for DFS and COF account and customer datasets, defining field-level transformations, data types, and business rules to guide pipeline development and ensure alignment across engineering and migration teams.
Enforced data governance by serving as Data Guardian for 100+ Exchange Datasets, reviewing, classifying, and approving each per Capital One's data classification framework and security protocols.
Built reusable Python utility modules for data validation, transformation, and secure file handling; conducted code reviews and established standards adopted by the broader migration team.
Established and monitored data quality checks at each pipeline stage; performed root-cause analysis on anomalies and drove remediation in coordination with owning teams.
Mentored junior team members on data governance best practices, pipeline standards, and data handling guidelines for the migration program.
Technologies: Snowflake, Databricks, Spark SQL, Amazon S3, Python, Jupyter Notebook, TADA APIs, EMP Jobs, Data Governance
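The de-duplication pattern used in the reconciliation work above can be sketched in plain Python. This is an illustrative reconstruction, not the production Spark SQL; the record fields (`customer_id`, `updated`) are hypothetical stand-ins:

```python
def dedupe_customers(records):
    """Keep the most recently updated record per customer key.

    Mirrors, in plain Python, the window-function pattern commonly
    written in Spark SQL as:
      ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY updated DESC) = 1
    """
    best = {}
    for rec in records:
        key = rec["customer_id"]
        # Replace the kept record only if this one is newer.
        if key not in best or rec["updated"] > best[key]["updated"]:
            best[key] = rec
    return list(best.values())
```

At Spark scale the same logic runs distributed, but the survivorship rule (latest record wins per key) is identical.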
Capital One | Plano, TX Oct 2023 – Dec 2024
Data Analyst / Data Engineer – Auto Finance Internal Services
Designed and built ETL/ELT pipelines using Python and Snowflake SnowSQL to migrate customer complaint and call metrics data from legacy Cassandra systems into Snowflake Data Warehouse on AWS.
Developed Apache Airflow DAGs for pipeline orchestration including scheduling, automated backfills, and retry configurations, replacing manual execution for weekly and monthly reporting workflows.
Engineered ingestion patterns from Amazon S3 into Snowflake, automating file uploads with Python and Airflow; supported near-real-time data handling across S3 buckets and Snowflake target tables.
Designed curated dimensional schemas and new tables/schemas in Snowflake to support QA and business analytics on auto finance customer complaints and call monitoring controls.
Wrote optimized SQL queries in Snowflake for data quality validation and reconciliation, comparing legacy data against migrated cloud datasets to ensure accuracy.
Developed and maintained interactive Tableau dashboards and scorecards (stack bars, scatter plots, Gantt charts, geographic maps) for QA and business teams; managed data extract schedules and incremental refresh on Tableau Server.
Applied Python (Pandas, NumPy) for data wrangling, cleaning, and transformation; developed reusable data frame visualization toolkits.
Maintained source code versioning in GitHub; documented pipeline logic and schema definitions to establish clear data contracts with consuming teams.
Technologies: Snowflake, AWS S3, Apache Airflow, Python (Pandas, NumPy), Cassandra, Tableau Desktop/Server, GitHub, SnowSQL
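The retry behavior the Airflow DAGs above encode with task-level `retries` / `retry_delay` settings can be sketched as a small standalone wrapper (a simplified illustration, not Airflow's actual scheduler logic):

```python
import time


def with_retries(fn, retries=3, delay=0.0):
    """Run fn, retrying on any exception up to `retries` extra attempts,
    sleeping `delay` seconds between attempts. This is the same contract
    Airflow expresses declaratively via a task's `retries` and
    `retry_delay` parameters."""
    attempt = 0
    while True:
        try:
            return fn()
        except Exception:
            attempt += 1
            if attempt > retries:
                raise  # retries exhausted: surface the failure
            time.sleep(delay)
```

Pairing retries with idempotent task bodies (as the bullets note) is what makes automated backfills safe: re-running a window produces the same result as running it once.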
City National Bank | New York, NY (Remote) Jan 2022 – Oct 2023
Data Analyst / Data Engineer
Led full SDLC for integrating monthly account statements with customer accounting systems and implementing a new check management process spanning multiple source systems.
Designed and developed SSIS packages for extracting data from flat files and loading into SQL Server, Oracle DB, and Excel targets, including Conditional Split, Derived Column, Lookup, and Merge transformations.
Performed detailed data mapping and created DMD (Data Mapping Documents) to define source-to-target data flows and business logic, enabling seamless collaboration between engineering and business teams.
Authored and optimized complex SQL/PL-SQL queries for ad-hoc reporting, data validation, and profiling; created views to abstract multi-table joins and business logic for downstream consumers.
Executed GAP analysis across As-Is and To-Be states; gathered requirements from SMEs and translated business needs into technical data flow specifications.
Leveraged GCP (BigQuery) and Oracle DB for cross-platform data integration; used Unix Shell scripting for data-centric automation and file-handling workflows.
Delivered post-implementation production support and performance monitoring, ensuring data pipeline reliability across integrated financial systems.
Technologies: SQL Server, Oracle DB, SSIS, PL/SQL, GCP, Unix Shell Scripting, D365, MDM, Excel/Flat Files
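The Conditional Split transformation used in the SSIS packages above routes each row down the first output whose expression matches. A minimal Python sketch of that routing model (the route names and predicates here are hypothetical):

```python
def conditional_split(rows, routes, default="error"):
    """Route each row to the first matching output, mimicking the SSIS
    Conditional Split transformation: `routes` is an ordered list of
    (output_name, predicate) pairs; rows matching no predicate fall
    through to the default output."""
    outputs = {name: [] for name, _ in routes}
    outputs[default] = []
    for row in rows:
        for name, pred in routes:
            if pred(row):
                outputs[name].append(row)
                break  # first match wins, as in SSIS
        else:
            outputs[default].append(row)
    return outputs
```

Order matters: like SSIS, a row is tested against conditions top-down and lands in exactly one output.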
Copart | Dallas, TX Mar 2017 – Dec 2021
Tableau Developer / Data Analyst
Designed and developed business intelligence dashboards and scorecards in Tableau Desktop using data from GCP BigQuery and Snowflake, enabling executive management to track past, current, and forecast vehicle auction sales data.
Wrote Python scripts to query datasets in BigQuery and third-party APIs; deployed Python applications in Google App Engine on GCP; integrated Google DialogFlow Enterprise chatbots with GCP webhook fulfillment.
Performed SQL and PL/SQL query tuning and optimization to improve system performance and data mart load times; used Toad for warehouse data validation.
Implemented MDM (Master Data Management) strategy through roadmap, architecture design, and delivery using Informatica MDM Hub 9.0 as the core hub; applied MDM to Facets data model.
Established and enforced data governance standards; performed data quality testing, gap analysis, and coordinated with data origination teams on remediation.
Created interactive Tableau dashboards with guided navigation, Table of Contents, and published to Tableau Server with scheduled data refresh.
Automated support activities and scripted file manipulation using Unix Shell/Perl scripting; managed Autosys job scheduling.
Worked across multiple DW architectures including GCP BigQuery, Redshift, and Snowflake; used Pentaho for data integration and migration tasks.
Technologies: Tableau Desktop/Server, GCP, BigQuery, Snowflake, Python, SQL, PL/SQL, Cassandra, Informatica PowerCenter, Unix Shell, Control-M
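The MDM work above relies on survivorship rules: merging duplicate records into one golden record, field by field, by trusting sources in priority order. A simplified sketch of that rule (the source names and fields are hypothetical; Informatica MDM Hub configures this declaratively):

```python
def golden_record(records, priority=("CRM", "BILLING", "WEB")):
    """Merge duplicate customer records into one golden record: for each
    field, keep the first non-empty value from the highest-priority
    source. A simplified version of MDM survivorship logic."""
    rank = {src: i for i, src in enumerate(priority)}
    # Visit records from most- to least-trusted source.
    ordered = sorted(records, key=lambda r: rank.get(r["source"], len(priority)))
    merged = {}
    for rec in ordered:
        for field, value in rec.items():
            if field == "source":
                continue
            # Only fill fields still missing, and only with real values.
            if merged.get(field) in (None, "") and value not in (None, ""):
                merged[field] = value
    return merged
```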
Gateway Health Plan | Pittsburgh, PA Dec 2015 – Mar 2017
Tableau Developer / Data Analyst
Developed and published interactive Tableau dashboards, reports, and workbooks from Tableau Desktop to Tableau Server for healthcare member experience and disease management analytics.
Designed complex LOD (Level of Detail) calculations, KPI scorecards, aggregations, table calculations, and parameters to meet business intelligence reporting requirements.
Collaborated cross-functionally with business stakeholders to translate healthcare program requirements into BI reporting solutions; evaluated Tableau vs. Power BI toolsets for optimal fit.
Developed complex mappings/sessions using Informatica PowerCenter for healthcare data loading; supported ETL using dimensional modeling (star and snowflake schemas).
Performed performance tuning on large-query reports using concurrent query execution; participated in OLAP (MOLAP, HOLAP, ROLAP) implementations.
Conducted end-user training on Tableau filtering, sorting, and interactive visualization; supported healthcare claims data analysis and program effectiveness reporting.
Technologies: Tableau Desktop/Server, SQL Server, LOD Calculations, KPIs, Informatica PowerCenter, OLAP, Star/Snowflake Schema
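A Tableau FIXED Level-of-Detail expression, as used in the dashboards above, aggregates a measure at a stated dimension and joins the result back to every row. The relational equivalent can be sketched in a few lines of Python (field names here are illustrative):

```python
from collections import defaultdict


def fixed_lod_avg(rows, dim, measure):
    """Attach { FIXED dim : AVG(measure) } to every row: aggregate the
    measure at the given dimension, then broadcast the aggregate back
    to row level, as a Tableau FIXED LOD expression does."""
    totals = defaultdict(lambda: [0.0, 0])  # dim value -> [sum, count]
    for r in rows:
        t = totals[r[dim]]
        t[0] += r[measure]
        t[1] += 1
    return [
        dict(r, lod_avg=totals[r[dim]][0] / totals[r[dim]][1])
        for r in rows
    ]
```

The key property, unlike an ordinary aggregate, is that the result stays at row granularity, so it can be compared against per-row values (e.g. a member's claim vs. their average).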
Visa Inc. | Austin, TX Oct 2014 – Dec 2015
SQL Server Developer
Developed complex stored procedures for index maintenance, data profiling, metadata search, and staging critical pipelines for loading data marts and reporting databases.
Designed and implemented SSIS packages (Conditional Split, Derived Column, SCD Type 2) for data scrubbing, validation, and historical data management in the data warehouse.
Automated ETL processes via SQL Server Agent job scheduling; built Master/Child SSIS package architecture for orchestrated pipeline execution.
Implemented SSISDB catalog with environment variables for multi-environment SSIS deployments; used Breakpoints, Checkpoints, and Event Handlers for debugging and performance optimization.
Technologies: SQL Server 2008 R2/2012, SSIS, SQL Server Agent, C#.NET, SharePoint, TFS, Power Pivot
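The SCD Type 2 handling the SSIS packages above implement keeps full history: when a tracked attribute changes, the current dimension row is expired and a new current row is inserted. A minimal sketch of that pattern (column names are illustrative; SSIS's SCD transformation generates this logic):

```python
from datetime import date


def scd2_apply(dimension, incoming, key, tracked, today=None):
    """Apply Slowly Changing Dimension Type 2 logic in place: if a
    tracked attribute changed, expire the current row (end_date,
    is_current=False) and append a new current row."""
    today = today or date.today().isoformat()
    current = {row[key]: row for row in dimension if row["is_current"]}
    for rec in incoming:
        cur = current.get(rec[key])
        if cur and all(cur[c] == rec[c] for c in tracked):
            continue  # no change: keep the existing current row
        if cur:
            cur["is_current"] = False
            cur["end_date"] = today
        dimension.append(
            dict(rec, start_date=today, end_date=None, is_current=True)
        )
    return dimension
```

The expired rows are never deleted, which is what lets downstream reports reconstruct the dimension as of any past date.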