AMITH RAJ
Data Analyst
Saint Louis, Missouri, USA
Contact: (636) 368-2912
Email: [email protected]
PROFESSIONAL SUMMARY:
Data Analyst Professional with 9 years of experience designing enterprise-scale BI solutions, building modern data pipelines, and delivering actionable insights across clinical, financial, and operational domains. Expert in Power BI, Tableau, SQL, Python, Snowflake, Databricks, and Azure, with a proven track record of transforming complex data into strategic intelligence for executive leadership.
Adept at developing robust data models, optimizing ETL/ELT workflows, and implementing scalable cloud data architectures that improve reporting accuracy, operational efficiency, and decision-making. Skilled collaborator who partners with finance, operations, and technology teams to gather requirements, modernize data ecosystems, and drive data governance and automation initiatives. Seeking to leverage deep technical expertise, strong business acumen, and leadership capabilities to advance data strategy and analytics maturity within a forward-thinking organization.
Designed, developed, and maintained data visualizations and analytics solutions using Power BI, Tableau, and Looker Studio, enabling executives and cross-functional teams to monitor performance, financial health, and operational KPIs.
Built scalable SQL- and Python-based ETL/ELT pipelines across Snowflake, Databricks, and cloud platforms (Azure, AWS), ensuring reliable, high-quality data for reporting and analytics.
Created optimized data models, DAX calculations, LOD expressions, and calculated fields to support advanced analytical requirements and self-service reporting.
Developed end-to-end data warehouse solutions including schema design, Snowflake ingestion (Snowpipe), query optimization, and performance tuning for large-scale data workloads.
Partnered with CFOs, finance teams, clinical leaders, and operations stakeholders to gather requirements, validate financial metrics, reconcile discrepancies, and deliver actionable insights.
Implemented data governance, validation rules, version control (Git), and documentation standards to ensure accuracy, consistency, and long-term maintainability of data assets.
Automated reporting workflows using SQL, Python, Azure Data Factory, and Databricks, reducing manual effort and improving reporting efficiency and data freshness.
Integrated multiple financial and operational systems (Sage, QuickBooks, NetSuite, EMRs) into Snowflake to support unified enterprise reporting.
Built and orchestrated workflows using Airflow, Control-M, and Databricks Jobs to support enterprise-scale data operations and dependable pipeline execution.
Ensured secure and compliant analytics environments using Row-Level Security, object-level security, and role-based access controls across Power BI, Tableau, and AWS QuickSight.
Delivered ad-hoc analyses, financial analytics (AR aging, revenue cycle, payment trends), and competitive insights to support strategic business decisions.
Trained business users on dashboards, reporting tools, and data best practices to promote data literacy and self-service analytics adoption.
Collaborated in Agile environments with data engineering, analytics, and business teams to enhance data architecture, improve pipeline performance, and support enterprise data strategy.
Managed code repositories with Git/GitHub, implemented CI/CD pipelines, and contributed to standards for collaborative BI and data engineering development.
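The row-level security pattern mentioned above can be illustrated with a minimal Python sketch (hypothetical roles and regions; the actual implementations used Power BI/Tableau RLS rules rather than application code):

```python
# Conceptual sketch of row-level security (RLS): each user role sees
# only the report rows its permissions allow. Role names and regions
# here are hypothetical illustrations.
ROLE_REGIONS = {
    "analyst_east": {"East"},
    "analyst_west": {"West"},
    "executive": {"East", "West"},
}

def apply_rls(rows, role):
    """Filter report rows down to the regions visible to the given role."""
    allowed = ROLE_REGIONS.get(role, set())
    return [r for r in rows if r["region"] in allowed]
```

In Power BI the same effect is achieved with DAX filter rules on roles, and in Tableau with user filters; the filtering principle is identical.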

TECHNICAL SKILLS:
Business Intelligence & Visualization
Power BI (Desktop & Service): Data modeling, DAX, Power Query M, RLS/OLS, custom connectors, performance optimization, Power BI Embedded.
Tableau (Desktop & Server): Dashboard development, LOD expressions, table calculations, data blending, Tableau Prep, performance tuning.
Looker Studio: Report development, data modeling, performance optimization.
Data Engineering & ETL
ETL Tools: Azure Data Factory, Azure Databricks, Azure Synapse, Airflow, Kafka, Sqoop, Flume.
Big Data Technologies: Spark (SQL, Streaming), Hive (Partitioning, Bucketing), HDFS, Redshift, Teradata.
Programming & Scripting
Languages: SQL, Python (data automation, analytics), R (statistical modeling), Java (Spark transformations).
Scripting: Shell scripting, automation scripts for data workflows.
Databases
RDBMS: SQL Server, Oracle, MySQL, PostgreSQL.
Cloud Databases: Azure SQL, AWS Redshift.
Big Data Storage: Hive tables, HDFS.

PROFESSIONAL EXPERIENCE:
Client: CENTENE CORPORATION, USA || Mar 2023 - Present
Role: Data Analyst
Responsibilities:
Developed and maintained Power BI dashboards that align clinical and financial KPIs, enabling executives to monitor performance, revenue cycle health, and operational efficiency.
Partnered closely with the CFO and finance team to validate financial metrics, reconcile reporting discrepancies, and ensure accuracy and transparency of financial statements.
Designed robust data models and DAX calculations in Power BI to support scalable reporting across operational and financial domains.
Built and optimized SQL-based data pipelines and transformations to support reliable reporting in Snowflake and downstream analytics tools.
Implemented rigorous data quality checks, documentation standards, and version control for SQL queries, ETL workflows, and published reports.
Recommended improvements to data architecture and contributed to the long-term enterprise data strategy, including schema design, integration patterns, and governance processes.
Integrated financial systems (Sage, QuickBooks, NetSuite) into the Snowflake data warehouse to unify financial, operational, and clinical reporting.
Collaborated with cross-functional stakeholders (finance, operations, clinical leadership) to gather requirements, translate business needs into technical solutions, and deliver actionable insights.
Analyzed and interpreted financial metrics, including AR aging, payment trends, margin performance, and revenue cycle KPIs, to support decision-making.
Developed Python and SQL scripts for advanced data transformations, automation, and quality assurance.
Operated effectively in a fast-paced, PE-backed environment, prioritizing high-impact analytics and driving continuous improvement in reporting capabilities.
Created interactive dashboards and reports in Tableau, providing stakeholders with real-time data visualizations and performance metrics.
Provided ad-hoc data analysis to support strategic decision-making for senior management.
Trained team members and business users on how to effectively use Tableau dashboards for data-driven decisions.
Assisted in the creation of detailed business requirements documents (BRD) and functional specifications.
Extracted, cleansed, and analyzed data from various sources using SQL queries to support business initiatives.
Built and maintained Tableau dashboards and reports to track KPIs, sales performance, and customer behavior.
Conducted root cause analysis and collaborated with IT to implement data solutions that improved data accuracy and business insights.
Supported senior analysts in preparing presentations for stakeholders, translating technical information into easily understandable business terms.
Proficient in publishing and administering reports and dashboards on the Power BI Service, including configuring security settings and workspaces and facilitating report sharing with stakeholders.
Accomplished in optimizing Power BI solutions to enhance performance by reducing query load times, improving report rendering, and streamlining data refresh processes.
Proficient in custom development tasks such as crafting custom connectors, building Power BI templates (PBIT), and implementing Power BI Embedded solutions where necessary.
Knowledgeable in implementing robust data security measures in Power BI, including Row-Level Security (RLS) and object-level security, to ensure data privacy and compliance.
Proficient in SQL for efficient data querying, manipulation, and management, complementing Power BI development capabilities.
Experienced in utilizing version control systems like Git for effective project management and seamless collaboration with fellow developers.
Adept at understanding and translating complex business requirements into actionable insights within Power BI, facilitating informed decision-making.
Excellent communication skills, facilitating effective interaction with business stakeholders, including requirements gathering and insights presentation from Power BI reports.
Strong programming skills in Python for building automations and data extraction, with strong exposure to data integration with third-party tools.
Environment: Python, GCP, Data Lake, Azure Data Factory, Databricks, SQL Server, Informatica (ETL), Power BI, Looker Studio & Tableau.
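The Python- and SQL-based quality-assurance scripts described in this role might look like the following minimal sketch (hypothetical table and bucket names; an in-memory SQLite database stands in for Snowflake):

```python
import sqlite3

def validate_ar_aging(conn):
    """Flag AR invoices whose stored aging bucket disagrees with the
    bucket computed from days outstanding (a typical reconciliation check)."""
    query = """
        SELECT invoice_id, days_outstanding, aging_bucket,
               CASE
                   WHEN days_outstanding <= 30 THEN '0-30'
                   WHEN days_outstanding <= 60 THEN '31-60'
                   WHEN days_outstanding <= 90 THEN '61-90'
                   ELSE '90+'
               END AS expected_bucket
        FROM ar_invoices
    """
    return [row for row in conn.execute(query) if row[2] != row[3]]

def build_demo_db():
    """Create a tiny sample table; real data lived in Snowflake."""
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE ar_invoices (invoice_id TEXT, days_outstanding INT, aging_bucket TEXT)"
    )
    conn.executemany(
        "INSERT INTO ar_invoices VALUES (?, ?, ?)",
        [("INV-1", 12, "0-30"), ("INV-2", 45, "31-60"), ("INV-3", 75, "31-60")],
    )
    return conn
```

Here INV-3 would be flagged, since 75 days outstanding belongs in the 61-90 bucket, not 31-60.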

Client: WEBSTER UNIVERSITY, ST LOUIS, USA || Aug 2021 - Feb 2023
Role: Data Analyst
Roles and Responsibilities:
Created various reports based on business requests and ad hoc requirements using Tableau Desktop; prepared financial deliverables (quarterly, accrual, and budgetary reports) for stakeholders to illustrate KPI metrics.
Designed and developed interactive, insightful Tableau dashboards and reports, including calculated fields, sets, and groups.
Created compelling data visualizations using Tableau's chart types, maps, and interactive features.
Gathered business requirements from stakeholders across various departments to create detailed functional and technical specifications.
Designed and implemented SQL queries to retrieve, manipulate, and aggregate data, facilitating the creation of ad-hoc and regular reports.
Developed and maintained interactive Tableau dashboards to visualize key metrics such as sales performance, operational efficiency, and customer behavior.
Automated reporting processes, reducing manual reporting efforts through the use of Tableau and SQL scripts.
Provided actionable recommendations based on data trends, enabling informed decision-making for business leaders and cross-functional teams.
Assisted in gathering and analyzing business requirements, working closely with both IT and business teams to define and prioritize project deliverables.
Wrote SQL queries to extract and clean data for business reporting, ensuring high data quality and integrity.
Developed Tableau dashboards to monitor key business metrics, including financial performance, product lifecycle, and customer satisfaction.
Conducted ad-hoc data analysis to support business initiatives and identify opportunities for process improvements.
Assisted in the creation of business reports and presentations, translating complex data into clear, understandable insights for management and stakeholders.
Connected Tableau to various data sources, including databases (SQL Server, Oracle), cloud platforms (AWS, Azure), spreadsheets, and web data connectors.
Performed data preparation and cleansing using Tableau Prep and other ETL tools, handling complex data transformations and data integration tasks.
Wrote custom SQL queries, joined tables, and performed data manipulations to support reporting and analysis.
Created effective data models within Tableau, including relationships, hierarchies, and aggregations, to support business requirements.
Used version control systems (e.g., Git) for managing Tableau projects, collaborating with other developers, and maintaining a history of changes.
Translated business requirements into effective Tableau solutions that provide actionable insights for stakeholders.
Communicated effectively with business users to gather requirements and explain insights derived from Tableau reports.
Environment: SQL Server, Teradata, MySQL, Data Warehouse, ETL, Looker Studio, Tableau & Power BI.
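The automated SQL-plus-script reporting described in this role can be sketched roughly as follows (hypothetical table and column names; an in-memory SQLite database stands in for the reporting warehouse):

```python
import csv
import io
import sqlite3

def kpi_report_csv(conn, month):
    """Run a parameterized monthly KPI query and return the result as CSV,
    the general shape of the automated report extracts described above."""
    rows = conn.execute(
        "SELECT department, SUM(amount) AS total "
        "FROM transactions WHERE month = ? "
        "GROUP BY department ORDER BY department",
        (month,),
    ).fetchall()
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["department", "total"])
    writer.writerows(rows)
    return buf.getvalue()

def build_demo_db():
    """Tiny sample data set for illustration."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE transactions (department TEXT, month TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO transactions VALUES (?, ?, ?)",
        [("Admissions", "2022-01", 100.0),
         ("Admissions", "2022-01", 50.0),
         ("Finance", "2022-01", 75.0)],
    )
    return conn
```

A scheduler (cron, Control-M, or similar) would invoke such a script to replace the manual monthly pull.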

Client: MANKIND PHARMA LTD., HYDERABAD, India || Oct 2019 - Jan 2021
Role: Data Analyst
Roles and Responsibilities:
Developed and optimized complex SQL queries for large-scale data extraction, transformation, and reporting across multiple business domains.
Collaborated closely with business stakeholders to gather, analyze, and translate functional requirements into scalable technical solutions.
Designed, built, and maintained modular, testable data models and transformation workflows using dbt, ensuring consistent data pipelines and documentation.
Scheduled, monitored, and troubleshot enterprise batch workflows using Control-M, improving job reliability and reducing downtime.
Ensured data accuracy, consistency, and integrity across systems through rigorous validation, data quality checks, and governance best practices.
Created comprehensive documentation for data flows, business rules, transformation logic, and technical specifications to support team knowledge sharing.
Supported data-driven decision making by delivering high-quality datasets, dashboards, and analytical assets to cross-functional teams.
Collaborated with engineering, analytics, and business teams in an Agile environment to improve data processes and pipeline performance.
Utilized version control tools like Git to manage code changes, enforce standards, and support collaborative development.
Worked with modern cloud data platforms (e.g., Snowflake/BigQuery/Redshift) to enhance scalability and enable advanced analytics capabilities.
Built Docker images to run Airflow in a local environment for testing ingestion and ETL pipelines.
Developed complex SQL queries, stored procedures, and views in Snowflake to support analytical and reporting needs.
Designed and optimized Snowflake schemas (star/snowflake) for scalable data warehousing solutions.
Implemented Snowpipe, file ingestion workflows, and automated data loading from cloud storage (AWS S3/Azure).
Tuned query performance using clustering, micro-partitioning analysis, and warehouse optimization.
Managed Snowflake roles, access controls, and resource monitoring for secure data operations.
Built ETL/ELT pipelines using Python to automate data extraction, transformation, and loading processes.
Created reusable Python modules for data cleansing, validation, and enrichment.
Integrated APIs and third-party data sources into the data ecosystem using Python.
Developed automation scripts for monitoring data quality and pipeline health.
Utilized Python libraries such as Pandas, PySpark, NumPy, and SQLAlchemy for data manipulation.
Designed and developed scalable ETL pipelines using PySpark in Databricks notebooks.
Optimized large-scale data processing workloads through efficient Spark transformations.
Managed Delta Lake tables for ACID-compliant data lake operations.
Scheduled and orchestrated workflows using Databricks Jobs and Workflow pipelines.
Collaborated with data engineers and analysts through shared workspaces and versioned notebooks.
Managed version control of code repositories using Git and GitHub, including branching, merging, and pull requests.
Implemented CI/CD pipelines through GitHub Actions for automated testing and deployment.
Collaborated with team members using issue tracking, code reviews, and project boards.
Ensured repository documentation, coding standards, and workflow consistency.
Designed interactive dashboards and visualizations in Tableau for business reporting and KPIs.
Created calculated fields, parameters, and LOD expressions to support complex analytical logic.
Integrated Tableau with Snowflake, databases, and cloud data sources.
Optimized dashboard performance and published workbooks to Tableau Server/Online.
Collaborated with business users to gather requirements and deliver actionable insights.
Built dynamic dashboards and data visualizations in Amazon QuickSight for operational and executive reporting.
Created datasets, custom fields, and calculated metrics within QuickSight analyses.
Integrated QuickSight with AWS data sources (S3, Athena, Redshift, RDS).
Implemented row-level security and access control for secure reporting.
Automated dashboard refreshes and managed SPICE capacity for efficient performance.
Utilized Redshift to store processed records and implemented batch scripts for continuous loading.
Wrote SQL stored procedures and views and performed in-depth testing of new and existing systems.
Manipulated and prepared data, extracting data from databases for business analysts using Tableau.
Reviewed normalized schemas for effective performance tuning of queries and data validation in OLTP and OLAP environments.
Leveraged MS SQL to solve complex business problems through data analysis on large datasets.
Environment: Tableau, Teradata, Scala, GCP, Python, Spark, Hive, NiFi, MySQL, Kafka, Shell Scripting, Cloudera, MongoDB, AWS.
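The reusable cleansing and validation modules mentioned in this role might be sketched like this (hypothetical field names, simplified to standard-library Python; the real modules also used Pandas and PySpark):

```python
def clean_record(record):
    """Normalize a raw record: trim string fields and drop blank values."""
    cleaned = {}
    for key, value in record.items():
        if isinstance(value, str):
            value = value.strip()
        if value in ("", None):
            continue
        cleaned[key] = value
    return cleaned

def validate_record(record, required=("patient_id", "amount")):
    """Return a list of validation errors; an empty list means the record passes."""
    errors = [f"missing {field}" for field in required if field not in record]
    amount = record.get("amount")
    if amount is not None:
        try:
            if float(amount) < 0:
                errors.append("negative amount")
        except (TypeError, ValueError):
            errors.append("non-numeric amount")
    return errors
```

Composing `clean_record` then `validate_record` over each incoming row gives the pipeline-health checks described above a reusable, testable shape.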

Client: Signiant Health, Hyderabad, India || Oct 2016 - Sep 2019
Role: Data Analyst
Roles and Responsibilities:
Developed end-to-end ETL processes to extract, transform, and load data from diverse sources, ensuring high data accuracy, consistency, and reliability.
Designed and developed interactive dashboards and reports in Tableau, Power BI, and Looker Studio, significantly improving data accessibility and user engagement.
Partnered with cross-functional teams to gather business requirements and translate them into actionable data visualizations and reporting solutions.
Enhanced forecasting accuracy through advanced data analysis and process optimization.
Automated key reporting workflows using SQL and Python, reducing manual work by 30% and enabling real-time analytics.
Developed and optimized complex SQL queries, driving a 25% improvement in data-driven decision-making accuracy.
Conducted competitive landscape analysis that contributed to a 15% increase in market share.
Built executive-level Power BI visuals that reduced report preparation time by 40%.
Utilized advanced SQL and scripting to extract, transform, and manipulate data across multiple databases, improving data retrieval performance.
Implemented effective data modeling techniques to support scalable analytics and insight generation.
Performed rigorous data validation and quality checks, quickly resolving discrepancies to maintain data integrity.
Conducted performance tuning and optimization of Tableau, Power BI, and Looker Studio solutions for faster rendering and improved user experience.
Provided training and ongoing support to stakeholders, enabling efficient use of BI tools and promoting data literacy.
Collaborated with IT teams to maintain, upgrade, and optimize reporting tools; contributed to best-practice documentation for visualization and reporting standards.
Managed a portfolio of dashboards and reports, ensuring timely updates and accurate data representation.
Designed and deployed a comprehensive Tableau dashboard delivering real-time sales insights, resulting in a 20% revenue increase in Q1 post-deployment.
Played a key role in migrating the reporting environment to Looker Studio, improving reporting efficiency and reducing licensing costs by 10%.
Recognized for consistently delivering high-quality, insightful dashboards and reports, enabling data-driven decisions across the organization.
Environment: ETL, Tableau, Power BI, Looker Studio, SQL, Python, Data Modeling, Data Validation, Data Visualization, Performance Optimization, Requirements Gathering, Cross-Functional Collaboration, Documentation.
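The extract-transform-load flow automated in this role can be sketched minimally as follows (in-memory stand-ins for the real sources and targets; names are illustrative):

```python
def extract(source_rows):
    """Pull raw rows from a source (here, an in-memory list of tuples)."""
    return list(source_rows)

def transform(rows):
    """Standardize region names and compute a derived revenue column."""
    out = []
    for region, units, price in rows:
        out.append({"region": region.title(), "revenue": round(units * price, 2)})
    return out

def load(rows, target):
    """Append transformed rows into the reporting target; return count loaded."""
    target.extend(rows)
    return len(rows)

def run_pipeline(source_rows, target):
    """Chain the three stages, as a scheduler would on each run."""
    return load(transform(extract(source_rows)), target)
```

Each stage being a separate, pure function is what makes the workflow easy to automate, test, and monitor.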