We are seeking a technically strong, business-savvy ETL & BI Reporting Analyst. This role blends ETL development, workflow automation, BI reporting, and data mart design. You will work across business and technical teams to build reporting solutions that deliver accurate, timely, and actionable insights, particularly around key performance indicators (KPIs). Details are outlined below:
Key Responsibilities:
Build and maintain ETL pipelines to ingest, transform, and structure data from various software systems (POS, ERP, CRM, etc.)
Develop data marts tailored for reporting and analytics by transforming raw data from source systems and enterprise data warehouses
Collaborate with stakeholders to gather reporting needs and design semantic layers and data models
Automate ETL workflows using Apache Airflow for scheduled and dependency-driven jobs
Write Python scripts to extract data via APIs, handle complex transformations, or enrich datasets
Design and develop dashboards and reports using Power BI, Tableau, or Looker
Define, calculate, and monitor KPIs
Ensure data consistency, lineage, and accuracy through validation and documentation
Contribute to data governance by documenting ETL processes, data marts, and KPI definitions to support metadata management, data lineage, and reporting consistency
Required Skills and Qualifications:
3+ years of experience in ETL development, data pipeline automation, and BI reporting
3+ years of hands-on experience in reporting and analytics over large-scale data warehouses
Strong hands-on experience in designing and building data marts for reporting use cases
Proficiency in Python for ETL scripting, automation, and data wrangling
Experience with Apache Airflow for scheduling and orchestrating data pipelines
Advanced SQL skills and understanding of relational databases
Expertise in BI tools (Power BI, Tableau, Looker)
Solid understanding of domain KPIs and reporting needs
Practical knowledge of data warehousing, star/snowflake schema, and dimensional modeling
Ability to translate business questions into data solutions and communicate effectively across teams
Preferred Qualifications:
Bachelor’s degree in Computer Science, Information Systems, or Data Analytics
Exposure to cloud data warehouses such as Redshift, BigQuery, or Snowflake
Experience with CI/CD, Git, and data versioning
Prior work in agile/scrum environments
Interested candidates are requested to send their cover letter and resume, detailing their work experience and expected salary, to hrd@ycotek.com by 28 January 2026.
For further details and updates, follow Ycotek on LinkedIn.