| Sowmya Marripeddi - Sr Data Analyst |
| [email protected] |
| Location: Jersey City, New Jersey, USA |
| Resume file: Sowmya_Marripeddi_Senior_Data_Analyst_Resume_1771523579855.pdf |
|
Sowmya Marripeddi
Senior Data Analyst & Business Analyst
+(315)-601-9343 | [email protected] | linkedin.com/in/sowmya-marripeddi-business-dataanalyst

PROFESSIONAL SUMMARY
- 10+ years of experience as a Data Analyst delivering actionable insights, building KPI frameworks, and translating complex business requirements into data-driven solutions across retail, financial services, healthcare, and enterprise domains.
- Partnered with business stakeholders, product owners, and executive leadership to gather requirements, define success metrics, and deliver analytical reporting solutions that directly influenced strategic decisions and operational outcomes.
- Consolidated data from heterogeneous sources (MS SQL, MongoDB, Snowflake, cloud storage) into analysis-ready datasets using SQL and Python, reducing manual data preparation effort by 95% and improving reporting turnaround time.
- Developed advanced SQL and Python-based analytical logic for large-scale reporting workloads, improving report generation performance by 3x and supporting critical business intelligence, forecasting, and predictive modeling initiatives.
- Designed and maintained Power BI and Tableau dashboards with DAX measures, drill-through filters, and row-level security, enabling self-service analytics and real-time KPI visibility for cross-functional business teams.
- Conducted end-to-end business analysis including gap analysis, root cause analysis, process mapping, and user story authoring to align analytical solutions with business objectives and drive continuous process improvement.
- Performed customer segmentation, fraud pattern analysis, and behavioral analytics using Python and SQL, delivering insights that supported fraud detection workflows and improved operational decision-making for business stakeholders.
- Optimized reporting environments and query performance through indexing, view creation, and data model refinement; reduced compute cost by 30% while maintaining high availability for business-critical reporting workloads.
- Maintained data quality and governance standards leveraging metadata management, access controls, and data classification practices to ensure data privacy, lineage, and compliance with enterprise policies and industry regulations.
- Collaborated in Agile cross-functional teams to deliver high-quality analytics products through iterative releases; contributed to sprint planning, requirements grooming, UAT coordination, and stakeholder sign-off workflows.
- Applied statistical modeling, A/B testing, cohort analysis, and funnel analysis to uncover trends, identify business opportunities, and validate hypotheses for product, marketing, and operations teams.
- Championed data literacy across business units by establishing self-service reporting standards, creating data dictionaries, and delivering training on Power BI and Tableau, accelerating time-to-insight and enabling faster decision-making.
- Technical expertise includes: Power BI, Tableau, SQL (T-SQL, PL/SQL, SnowSQL), Python (Pandas, NumPy, Scikit-learn, Matplotlib), DAX, Excel (Advanced), Snowflake, BigQuery, Statistical Analysis, Data Governance, and Agile SDLC.
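The multi-source consolidation described in the summary can be sketched in miniature with pandas. This is an illustrative sketch only: the in-memory frames and column names (order_id, customer_id, segment) are hypothetical stand-ins for the actual MS SQL, MongoDB, and Snowflake extracts.

```python
import pandas as pd

# Hypothetical extracts standing in for MS SQL and MongoDB source pulls.
sql_orders = pd.DataFrame(
    {"order_id": [1, 2, 3], "customer_id": [10, 11, 10], "amount": [250.0, 100.0, 75.0]}
)
mongo_customers = pd.DataFrame(
    {"customer_id": [10, 11], "segment": ["retail", "wealth"]}
)

# Join the heterogeneous extracts into one analysis-ready dataset.
analysis_ready = sql_orders.merge(mongo_customers, on="customer_id", how="left")

# Simple quality gate before the data reaches reporting:
# every order must resolve to a known customer, and amounts must be non-negative.
assert analysis_ready["segment"].notna().all()
assert (analysis_ready["amount"] >= 0).all()

# Aggregate to a KPI-style view: total spend per customer segment.
spend_by_segment = analysis_ready.groupby("segment")["amount"].sum()
```

The same join-validate-aggregate shape applies regardless of whether the extracts arrive from a database driver, an API, or flat files.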
SKILLS
Programming & Query Languages: SQL (T-SQL, PL/SQL, SnowSQL, HiveQL), Python (Pandas, NumPy, Matplotlib, Seaborn, Scikit-learn, Plotly), R (ggplot2, dplyr, tidyr, caret), DAX, MDX, VBA, Excel Macros
BI, Reporting & Visualization Tools: Power BI (DAX, Power Query, RLS, Paginated Reports, Dataflows, Deployment Pipelines), Tableau (LOD Expressions, Calculated Fields, Story Points, Prep Builder), SSRS, Looker, Google Data Studio, Qlik Sense, Excel (PivotTables, Power Pivot, VLOOKUP, Index-Match, Solver, Advanced Charting), MicroStrategy
Databases, Warehouses & Data Sources: Relational (SQL Server, Oracle, MySQL, PostgreSQL, Teradata, MS Access), Cloud Warehouses (Snowflake, Google BigQuery, Azure Synapse Analytics), NoSQL (MongoDB, Cosmos DB), Data Modeling (Star Schema, Snowflake Schema, SCD Types I & II, Dimensional Modeling), OLAP Cubes (SSAS)
Analytics, Statistics & Business Analysis: Requirements Gathering, Gap Analysis, Root Cause Analysis, Process Mapping, User Stories, UAT Coordination, A/B Testing, Hypothesis Testing, Cohort Analysis, Funnel Analysis, Customer Segmentation, Forecasting & Trend Analysis, KPI Framework Design, Stakeholder Reporting, Data Storytelling
Collaboration, Governance & Data Science: Agile (Jira, Confluence, Sprint Planning), Waterfall SDLC, Data Governance (Purview, Alation), Data Quality & Validation, Data Dictionary Management, Statistical Modeling, Machine Learning (Regression, Classification, Clustering, Forecasting), Feature Engineering, MLflow, Cross-functional Collaboration

EDUCATION
JNTU Kakinada, Kakinada, IN
Bachelor's Degree in Computer Science and Engineering, June 2009 - Aug 2013
Relevant Coursework: Data Structures and Algorithms, Cloud Computing, Distributed Systems, Database Systems, Data Warehousing, Big Data Analytics, Machine Learning, Statistical Modeling, Data Mining, ETL Design, Data Governance.
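Cohort analysis, listed among the analytics skills above, can be sketched with a few pandas operations. The events frame below is hypothetical sample data; the real work would run against warehouse tables.

```python
import pandas as pd

# Hypothetical user-activity events (one row per active user-month).
events = pd.DataFrame({
    "user": ["a", "a", "b", "b", "c"],
    "month": pd.to_datetime(
        ["2024-01-01", "2024-02-01", "2024-01-01", "2024-03-01", "2024-02-01"]
    ),
})

# A user's cohort is the month of their first observed activity.
events["cohort"] = events.groupby("user")["month"].transform("min")

# Months elapsed since cohort start (0 = the signup month itself).
events["period"] = (
    (events["month"].dt.year - events["cohort"].dt.year) * 12
    + (events["month"].dt.month - events["cohort"].dt.month)
)

# Retention matrix: distinct active users per cohort and period.
retention = events.pivot_table(
    index="cohort", columns="period", values="user",
    aggfunc="nunique", fill_value=0,
)
```

Dividing each row by its period-0 column turns the counts into retention rates, which is the view typically surfaced in a BI dashboard.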
WORK EXPERIENCE

Senior Business Intelligence Analyst, UBS | Weehawken, NJ | Jan 2025 - Present
At UBS, I led reporting and analytics initiatives across wealth management, investment research, and client portfolio intelligence. My work focused on translating complex financial data into actionable insights for advisors and executives, accelerating client reporting cycles, and ensuring compliance with SEC, FINRA, and internal risk mandates.
- Analyzed large-scale client transaction and portfolio datasets using SQL and Python to deliver performance attribution insights and investment trend reports for wealth management teams.
- Consolidated global client and market datasets from 10+ sources into a unified reporting layer, reducing reporting cycle time by 25% and improving data accessibility across regional advisory desks.
- Designed and maintained Tableau and Power BI dashboards to analyze client holdings, risk exposure, and market data, directly supporting the personalized wealth advisory initiative for portfolio managers.
- Implemented data governance standards for lineage, access control, and metadata management, enhancing audit readiness and secure collaboration across compliance and front-office departments.
- Partnered with compliance teams to build regulatory reporting views for trade surveillance and anomaly monitoring using SQL, accelerating SEC and FINRA submission workflows.
- Performed real-time behavioral analytics on market data feeds and client activity to surface predictive alerts, enabling time-sensitive trading insights and proactive advisor interventions.
- Automated recurring investment product performance reports using Python and SQL, enhancing data reliability and reducing manual reporting effort by 40% across the reporting supply chain.
- Conducted post-trade analysis, client engagement tracking, and product usage analytics to surface actionable patterns for business stakeholders and relationship managers.
- Built self-service Tableau dashboards to visualize portfolio performance trends, streamlining regulatory submission reporting cycles for SEC and FINRA approvals.
- Applied statistical modeling and regression analysis to detect early indicators of client churn and portfolio risk, enabling proactive advisor interventions and improving client retention in wealth management segments.
- Standardized reporting delivery workflows and version control practices, improving turnaround speed and production reliability for time-critical financial reporting.
- Documented data definitions, business rules, and reporting standards, ensuring consistent metric interpretation across compliance, risk, and front-office teams.
- Enforced role-based access controls and data security standards to protect client financial data, achieving strict compliance with internal risk controls and regulatory data security requirements.
Environment: SQL Server, Snowflake, BigQuery, Python (Pandas, NumPy, Scikit-learn), Tableau, Power BI, DAX, Excel (Advanced), PostgreSQL, Statistical Modeling, Data Governance, Financial Reporting, Jira, Confluence.

Senior Data Analyst, Citizens Bank | Jersey City, NJ | Sep 2023 - Jan 2025
At Citizens Bank, I led enterprise analytics and reporting initiatives across retail banking, fraud operations, and customer intelligence. My work focused on delivering self-service BI dashboards, building KPI frameworks, and enabling data-driven decision-making across diverse business units while enforcing data governance and compliance standards.
- Consolidated 10+ internal and external data sources into a unified reporting layer using SQL and Python, improving analytics accessibility by 40% and streamlining cross-domain reporting for business stakeholders.
- Designed and maintained dynamic reporting frameworks with parameterized queries and reusable data models, accelerating dashboard deployment timelines by 35% across evolving business domains.
- Performed large-scale data cleansing, deduplication, and transformation using SQL and Python to ensure high-quality, reliable inputs for executive reporting and business intelligence workloads.
- Implemented data governance standards including lineage tracking, access control policies, and metadata documentation, aligning analytics outputs with enterprise-wide compliance goals.
- Conducted behavioral analytics and fraud pattern analysis using statistical methods, enabling instant fraud detection workflows and improving customer response time by 25%.
- Designed optimized SQL queries and data models with partitioning and indexing strategies, powering mission-critical dashboards and reducing report load times for business teams.
- Enforced data retention and archival policies in coordination with compliance teams, reducing storage overhead by 30% while meeting regulatory data lifecycle requirements.
- Streamlined analytics delivery workflows and reporting version control, reducing errors and improving reporting agility by 40% across business units.
- Ensured data security and privacy compliance through RBAC implementation and access auditing, aligning reporting outputs with GDPR and CCPA regulatory standards.
- Created self-service Power BI dashboards for KPI monitoring, giving stakeholders real-time visibility and boosting operational efficiency and business responsiveness.
- Developed Python-based data validation scripts for automated quality checks and anomaly detection, reducing manual QA time by 15% and increasing trust in enterprise reporting outputs.
- Established standardized reporting templates and data quality frameworks across business domains, cutting reporting inconsistencies and data friction by 40%.
- Leveraged data cataloging tools to improve data discoverability and metric traceability, cutting issue resolution time by 35% for business and compliance teams.
- Enriched customer profiles by integrating third-party financial data with internal datasets using SQL joins and Python, driving a 15% increase in upsell and cross-sell conversions.
- Monitored reporting SLAs and data freshness using alerting dashboards, achieving 20% downtime reduction and proactive issue resolution for business-critical reports.
- Applied clustering, KNN, PCA, and linear regression techniques in Python to support fraud detection models and customer risk profiling for the analytics and risk teams.
Environment: SQL Server, Snowflake, Python (Pandas, NumPy, Scikit-learn), Power BI, DAX, Tableau, Excel (Advanced), Statistical Modeling, Data Governance, Data Quality, Jira, Confluence, GDPR, CCPA.

Data Analyst, Johnson & Johnson | Santa Clara, CA | Aug 2021 - Sep 2023
At Johnson & Johnson, I led analytics initiatives across pharmaceutical supply chains, clinical trial reporting, and public health data systems. My work focused on translating complex healthcare data into actionable insights to improve patient outcomes, accelerate drug development reporting cycles, and ensure compliance with HIPAA and FDA mandates.
- Analyzed large-scale genomic and clinical trial datasets using SQL and Python to generate medical insights and precision medicine research reports for R&D and clinical teams.
- Integrated global trial datasets from multiple geographies into a unified reporting layer, reducing research reporting cycle time by 25% and improving data access for cross-regional study teams.
- Designed Tableau and Power BI reporting views to analyze patient records and real-world evidence, directly supporting the personalized medicine initiative and streamlining insights for research stakeholders.
- Implemented data governance standards for lineage, access control, and metadata documentation, enhancing audit readiness and enabling secure cross-departmental collaboration.
- Built pharmacovigilance reporting views for adverse drug event monitoring by consolidating transactional and reference datasets using SQL, accelerating compliance submission workflows.
- Conducted logistics and supply chain analytics by analyzing pharmaceutical delivery and device telemetry data, enabling predictive insights and minimizing supply delays.
- Automated recurring data preparation workflows for manufacturing quality reporting using Python scripts, enhancing data reliability and ensuring traceability across the supply chain.
- Performed post-market surveillance analysis, patient adherence tracking, and device usage reporting to support regulatory and operations stakeholders.
- Built self-service Tableau dashboards to visualize clinical trial outcomes and streamline FDA submission reporting cycles for regulatory approvals.
- Applied statistical modeling to detect early indicators of chronic illness and enable predictive care planning, improving patient outcome reporting in clinical settings.
- Standardized analytics delivery and reporting version control workflows, improving turnaround speed and production reliability for regulatory and business reporting.
- Documented data dictionaries, business rules, and reporting standards, ensuring HIPAA-compliant data handling and consistent metric definitions across departments.
- Enforced role-based access controls and data auditing practices to protect patient data, achieving strict compliance with HIPAA and internal data security standards.
Environment: SQL Server, PostgreSQL, Python (Pandas, NumPy, Scikit-learn), Tableau, Power BI, DAX, Excel (Advanced), Statistical Modeling, Data Governance, HIPAA Compliance, Jira, Confluence.

Data Analyst, First Republic Bank | San Francisco, CA | Aug 2020 - Aug 2021
At First Republic Bank, I delivered analytics and reporting solutions to support compliance reporting, financial analytics, and customer data analysis.
My work ensured timely, accurate reporting and audit compliance during surging digital banking demand, enabling business leaders to make faster, data-driven decisions.
- Queried and transformed large volumes of customer and loan data from SQL Server using T-SQL and Python, improving data availability for regulatory compliance and BI reporting by 30%.
- Developed and optimized SQL-based analytical queries for data cleansing, validation, and fraud detection use cases, significantly improving reporting performance and analytical consistency.
- Standardized reporting delivery and testing workflows, reducing manual overhead and ensuring consistent, error-free report releases across business teams.
- Implemented automated data quality checks using Python scripts and rule-based validation logic, decreasing data errors and manual intervention in regulatory reporting flows.
- Maintained data quality checks and lineage documentation across all reporting outputs, supporting FDIC audits, SOX controls, and compliance submissions.
- Automated loan data analysis and reporting processes, enabling operational resilience and a 40% increase in reporting throughput during peak pandemic-period application surges.
- Enriched customer risk profiles by joining third-party financial feeds with internal datasets using SQL, improving customer segmentation and credit risk analytical outputs.
- Designed reusable data mapping templates for legacy-to-cloud data migration analysis, ensuring consistency, accuracy, and compliance across staging and reporting layers.
- Optimized SQL reporting queries through indexing, query restructuring, and execution plan analysis to maintain consistent SLAs for regulatory and treasury reporting.
- Developed Power BI dashboards for daily KPI reporting to branch leadership, driving better decision-making and real-time financial visibility.
- Enforced data governance practices including access controls, retention policies, and audit logging, reducing compliance risk and ensuring data policy adherence.
Environment: SQL Server, Oracle, Python (Pandas, NumPy), Power BI, DAX, Excel (Advanced), SSIS, Data Governance, Compliance Reporting, SOX, FDIC, Jira.

Data Analyst, Kaiser Permanente | Oakland, CA | Feb 2018 - Aug 2020
At Kaiser Permanente, I delivered analytics and reporting solutions across clinical, supply chain, and operational domains. My work supported predictive healthcare reporting, HIPAA-compliant data governance, and patient engagement analytics, helping improve patient outcomes and operational efficiency across 50+ hospital units.
- Analyzed real-time clinical data streams to surface decision-support insights for emergency care teams, reducing response latency by 20% through faster data availability and cleaner reporting.
- Enriched patient profiles by joining lab, medication, and imaging datasets using SQL and Python, enabling holistic population health insights and advanced patient segmentation for care teams.
- Automated appointment reminder and care follow-up reporting workflows using Python scripts, reducing patient readmissions and supporting value-based care program tracking.
- Implemented data cataloging and governance standards for patient datasets, enforcing data classification, lineage, and access controls for audit-ready HIPAA-compliant analytics.
- Designed clinical and operational KPI dashboards in Power BI, improving care coordination visibility and enabling data-driven decisions for frontline clinical and operations staff.
- Built a centralized reporting layer consolidating data from multiple clinical and operational systems to support performance reporting and outcome-based analysis across 50+ hospital units.
- Developed patient risk scoring models using Python (regression, clustering) to support chronic disease prevention programs and reduce hospital readmissions.
- Optimized SQL query performance and reporting workloads through indexing and view creation, ensuring SLA compliance during high-volume clinical reporting periods.
- Applied data governance frameworks including role-based access controls, metadata management, and audit logging to maintain HIPAA-aligned data security and lineage visibility.
- Automated data quality monitoring and anomaly detection using Python, reducing downstream reporting errors and improving accuracy of regulatory compliance submissions.
- Standardized analytics delivery and version control practices for reporting outputs, ensuring reliable deployment of dashboards and analytical views across clinical environments.
- Integrated EHR data for bidirectional patient data analysis using FHIR-compliant data extracts, enhancing reporting completeness and reducing patient data silos across departments.
Environment: SQL Server, PostgreSQL, Python (Pandas, NumPy, Scikit-learn, Matplotlib), Power BI, DAX, Tableau, Excel (Advanced), Statistical Modeling, Data Governance, HIPAA, FHIR, Jira, Confluence.

Business Intelligence Analyst, Guild Mortgage | San Diego, CA | Jan 2017 - Feb 2018
At Guild Mortgage, I modernized reporting capabilities and delivered scalable analytics solutions to support risk analytics, loan portfolio analysis, and regulatory compliance. My focus was on improving reporting speed, building self-service BI dashboards, and enabling data-driven decision-making for credit risk, finance, and executive teams.
- Extracted and analyzed large-scale mortgage data from Oracle and PostgreSQL using SQL, enabling loan performance analytics and risk scoring reports used by credit risk teams and auditors.
- Redesigned legacy reporting workflows using Python and SQL, significantly reducing report generation latency and enabling faster compliance submissions to federal regulators and internal stakeholders.
- Developed batch analytics reports for loan trend analysis, improving reporting performance 10x over legacy processes and enabling near real-time loan insights for executive dashboards.
- Built advanced SQL and Python-based analytical models for forecasting delinquency, servicing trends, and mortgage application funnel performance, directly impacting portfolio management strategies.
- Analyzed real-time mortgage application data to support faster credit decisions, fraud detection workflows, and instant loan status reporting for customers and relationship managers.
- Migrated reporting datasets to cloud storage, enhancing data durability, enabling cloud-based analytics, and supporting business continuity planning.
- Modeled structured mortgage data in Snowflake to support interactive BI dashboards and compliance reports used across finance, risk, and executive teams for strategic decision-making.
- Wrote complex HiveQL and SQL queries over partitioned datasets to generate detailed reports for loan servicing, credit risk, default probability tracking, and investor performance summaries.
- Created reusable analytical functions in Python to implement domain-specific mortgage processing logic and automate exception handling for edge-case reporting scenarios.
- Optimized SQL report performance through query restructuring, efficient join strategies, and execution plan analysis, ensuring stability during peak reporting windows.
Environment: SQL Server, Oracle, PostgreSQL, Snowflake, Python (Pandas, NumPy), Tableau, Power BI, DAX, Excel (Advanced), Informatica, Jira, BI Reporting, Data Governance.

BI Developer, Value Labs | Hyderabad, India | Sep 2013 - Oct 2015
At Value Labs, I built high-performance business intelligence solutions and reporting frameworks for enterprise clients. My work empowered business teams with self-service analytics, optimized reporting workflows, and consistent data models, significantly enhancing decision-making processes across departments.
- Developed advanced stored procedures, functions, and views in SQL Server to improve query efficiency and report responsiveness for mission-critical business operations and analytics workloads.
- Performed SQL Server performance tuning by analyzing execution plans, adding strategic indexes, and restructuring queries to minimize latency for large-scale reporting workloads.
- Designed ETL workflows using SSIS to transform and load data from SQL Server, Access, and Excel into structured reporting models, ensuring analytics readiness and data consistency across business units.
- Consolidated data from heterogeneous systems using Informatica and SSIS, ensuring consistent data synchronization and supporting seamless business process reporting across departments.
- Authored entity-relationship diagrams and data lineage documentation, enabling full traceability of data dependencies for compliance, auditing, and regulatory reporting requirements.
- Designed dimensional data models including SCD Types I and II, star and snowflake schemas, and surrogate key logic to support scalable OLAP analytics and self-service reporting.
- Implemented data validation and error-handling frameworks with logging and checkpoints, ensuring data quality, rapid root cause analysis, and reliable report outputs for business stakeholders.
- Created and maintained SSAS multidimensional cubes with KPIs, aggregations, and partitioning to support rapid OLAP querying and deep-dive self-service analytics with sub-second response times.
- Scheduled and monitored reporting workflows using Airflow and Oozie, reducing manual intervention and ensuring timely, event-driven delivery of analytical reports.
- Built dynamic SSRS reports with drill-down, drill-through, and cascading parameters on SSAS cubes, delivering executive dashboards and operational reporting for real-time business monitoring.
Environment: MS SQL Server, SSIS, SSAS, SSRS, Informatica, Power BI, DAX, Excel (Advanced), Tableau, Data Modeling, SharePoint, Git, Apache Airflow, Jira.
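The SCD Type II dimensional modeling mentioned in several roles above can be illustrated with a minimal sketch. This uses SQLite in place of SQL Server, and the dim_customer table, column names, and apply_scd2 helper are hypothetical; the pattern (close the current row, insert a new version) is the general one.

```python
import sqlite3

# Hypothetical customer dimension with SCD Type II columns:
# a surrogate key, the natural key, and validity/current-flag bookkeeping.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_customer (
    sk          INTEGER PRIMARY KEY,  -- surrogate key
    customer_id INTEGER,              -- natural/business key
    city        TEXT,                 -- tracked attribute
    valid_from  TEXT,
    valid_to    TEXT,                 -- NULL = still current
    is_current  INTEGER
);
INSERT INTO dim_customer (customer_id, city, valid_from, valid_to, is_current)
VALUES (42, 'Oakland', '2023-01-01', NULL, 1);
""")

def apply_scd2(con, customer_id, new_city, change_date):
    """If the tracked attribute changed, close the current row and insert a new version."""
    row = con.execute(
        "SELECT sk, city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,),
    ).fetchone()
    if row and row[1] != new_city:
        # Close out the old version...
        con.execute(
            "UPDATE dim_customer SET valid_to=?, is_current=0 WHERE sk=?",
            (change_date, row[0]),
        )
        # ...and open a new current version, preserving full history.
        con.execute(
            "INSERT INTO dim_customer (customer_id, city, valid_from, valid_to, is_current) "
            "VALUES (?, ?, ?, NULL, 1)",
            (customer_id, new_city, change_date),
        )

apply_scd2(con, 42, "San Diego", "2024-06-01")
versions = con.execute(
    "SELECT city, is_current FROM dim_customer WHERE customer_id=42 ORDER BY sk"
).fetchall()
```

Both versions of the customer remain queryable, so historical reports join to the row that was current at the time of each fact.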