Data professional with 5+ years of hands-on experience in Analytics, Data Engineering, and Reporting Automation within the FinTech, E-commerce, and Energy sectors.
Proven track record of building analytical solutions from scratch, automating key reporting, and developing B2B analytical services using dbt, Airflow, PostgreSQL, and GreenPlum.
Expert in designing ETL/ELT processes and data marts with complex business logic.
Experienced in BI development, migrating BI stacks (from Looker Studio to an open-source BI tool), and implementing Data Quality processes.

pvl_ko
explain.analyze@gmail.com

Summary

  • 6 years in Analytics and 10 years in Reporting
  • Analytics Engineer & Data Warehouse System Analyst
  • Extensive experience in DWH, ELT, BI
  • Proficient in Data Modeling, Data Quality, and Data Governance
  • Experience in the fintech, e-commerce, and energy domains
  • Team leadership and mentoring experience

Professional Experience

Analytics Engineer / Data Warehouse System Analyst

AVO Bank

11.2024 - Present

Remote

AVO is an online-first bank accessible via smartphone. It achieved impressive results in its first year and has set ambitious goals in Uzbekistan's fintech market.
The Data Warehouse is the bank's central data repository, consolidating all the data the business needs for decision-making, from marketing communications to the risk assessment models used for credit approvals.

- Designed and delivered several end-to-end analytical solutions for newly launched banking products, providing executive stakeholders with frameworks to evaluate performance and refine post-launch strategies.
- Implemented an event-based Activity Schema combined with a schemaless approach to centralize key user events, eliminating logic duplication and achieving hourly data delivery (latency < 1h) while significantly reducing time-to-data for stakeholders.
- Collaborated with stakeholders to develop a client segmentation methodology, creating the data foundation for personalized marketing campaigns and enhanced customer engagement.
- Engineered robust data transformation pipelines and a standardized semantic layer using dbt, incorporating automated Data Quality controls to increase business trust and ensure 100% reliable reporting.
- Initiated and executed the refactoring of critical data pipelines, improving SLA compliance and reducing time-to-data by tuning query performance and resolving system bottlenecks.
- Shortened onboarding time and improved team efficiency by mentoring new hires and documenting core processes, workflows, and best practices.

SQL, dbt, Data Modeling, Data Quality, Data Governance, ELT, Data Warehouse (DWH), git, Reporting & Analysis

Analytics Engineer / Business Intelligence Engineer

omni.sale

2023 - 11.2024

Remote

A SaaS startup in the e-commerce domain that enables sellers to optimize their operations across marketplaces through automation tools.

- Developed a comprehensive reporting ecosystem for financial, marketing, and inventory analytics, integrating complex calculation logic for 10+ core KPIs (GMV, CAC, ROMI, AOV, etc.) to drive data-informed decision-making.
- Built and scaled the DWH semantic layer, establishing unified business definitions and standardizing KPIs across all analytical domains to ensure 100% consistency between the warehouse and BI reports.
- Implemented automated Data Quality controls and source-to-target (S2T) mappings for marketplace API integrations, significantly increasing the reliability of sales and inventory data.
- Led the migration from Looker Studio to an open-source BI platform, reducing reliance on SaaS providers and enhancing the flexibility of data visualizations and access control.
- Designed and maintained scalable ETL processes and data marts, providing stakeholders with real-time visibility into developer workloads and operational efficiency.

SQL, dbt, Data Modeling, Data Quality, ELT, Data Warehouse (DWH), Business Intelligence, git, Reporting & Analysis

Data Warehouse Business Analyst

Samokat

2023

Remote

Event Warehouse is a centralized data repository product being developed for Samokat, a global leader in the Q-commerce industry, operating the world’s largest network of 2,400+ dark stores and processing over 250M orders annually.
It captures user actions and interactions within the mobile app and website, enabling the marketing team to deliver personalized product recommendations, driving higher conversion rates and customer satisfaction.

- Designed a data-driven anti-fraud solution to identify and block fraudulent in-app advertising traffic, resulting in a 20% increase in ROMI and a 15% reduction in infrastructure load.
- Developed an automated order reconciliation model to sync actual order sizes with advertising partner data, leading to a $100K monthly reduction in customer acquisition costs (CAC).

Problem Solving, Requirements Analysis, BPMN, Data Warehouse (DWH), Business Analysis

Business Data Analyst

Inter RAO

2016 - 2022

Moscow, Russia (Hybrid)

Inter RAO Engineering is the Engineering Division of Inter RAO, the largest public diversified energy company in Eastern Europe by retail customer base (~20M) and a top-tier power producer with ~31 GW capacity.

- Developed high-performance ETL pipelines and database schemas to automate data ingestion, validation, and transformation from multiple sources, resulting in a severalfold reduction in turnaround time for recurring data requests.
- Designed and implemented a standardized data framework for regulatory reporting, ensuring 100% accuracy and compliance for high-stakes submissions to the Ministry of Energy and Corporate Headquarters.
- Optimized the ad-hoc reporting ecosystem, boosting the "on-time completion" metric to 98% by implementing automated validation checks and reorganizing data workflows.
- Migrated the department's IT budgeting and forecasting process from manual spreadsheets to an automated system, eliminating human error and ensuring real-time data consistency.
- Managed and optimized an IT budget of $1.5M+, achieving a 10% annual cost reduction by conducting data-driven audits of vendor contracts and consolidating procurement.
- Served as a primary link between business and tech teams, translating complex stakeholder requirements into clear technical specifications for analytics product development.

MySQL, PostgreSQL, Problem Solving, Prefect, Dataflow Automation, Reporting & Analysis, Requirements Analysis, SQL, Qlik Sense, Python

Skills

Technical Skills

PostgreSQL, GreenPlum, MySQL, dbt (Data Build Tool), Airflow, Data Warehouse (DWH), ELT (Extract, Load, Transform), Data Modeling, Business Intelligence (BI), Docker, git

Domain Expertise

E-Commerce, FinTech, Energy

Education

Digital Industrial Transformation

Peter the Great St. Petersburg Polytechnic University

Focus on digital transformation and industrial systems

Continuing Education

Russia

IT Management

Higher School of Economics

Professional development in IT management practices

Professional Development

Russia

Mechanical Engineering

Tyumen State Oil & Gas University

Engineer’s Degree

Higher Education

Russia

Certifications

Big Data Fundamentals with PySpark

DataCamp

Credential ID: 28089656

Cleaning Data with PySpark

DataCamp

Credential ID: 28089733
