06 Apr | Canopus Infosystems - A CMMI Level 3 | New Delhi
Apply on Kit Job: kitjob.in/job/46ydpe
Job Title: Data Engineer (Analyst)
Experience: 2.5 to 5 Years
Location: PAN India (Remote/In office as applicable)

About the Role:
We are looking for a Data Engineer (Analyst) to build and maintain reliable data pipelines and analytics-ready datasets that power BI reporting, product insights, and business decision-making. You'll work across multiple data sources, model clean reporting layers, and ensure data quality end-to-end.
Key Responsibilities:
- Build and maintain scalable ETL/ELT pipelines (batch and incremental) using SQL + Python
- Integrate data from databases, APIs, SaaS tools, event data, and flat files
- Design analytics-ready data models (star schema/marts) for self-serve reporting
- Create and optimize transformations in a cloud warehouse/lakehouse (e.g., Snowflake, BigQuery, Redshift, Synapse, Databricks)
- Partner with stakeholders to define KPIs, metric logic, and reporting requirements
- Maintain dashboards and reporting outputs in tools like Power BI, Tableau, Looker, or Sigma
- Implement data quality checks, monitoring, alerts, and documentation to keep datasets trusted
- Tune performance and cost (incremental loads, partitioning, query optimization, file formats)
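The incremental, idempotent loads mentioned above can be sketched with a high-watermark plus upsert pattern. This is an illustrative example only, not this team's actual pipeline; the table and column names (source, target, updated_at) are made up, and sqlite3 stands in for a real warehouse connection.

```python
import sqlite3

def incremental_load(conn):
    """Hypothetical incremental load: high watermark + upsert, so re-runs are idempotent."""
    # High-watermark: only pull source rows newer than what target already holds.
    last = conn.execute(
        "SELECT COALESCE(MAX(updated_at), 0) FROM target"
    ).fetchone()[0]
    rows = conn.execute(
        "SELECT id, value, updated_at FROM source WHERE updated_at > ?",
        (last,),
    ).fetchall()
    # Upsert keyed on id: replaying the same batch cannot create duplicates.
    conn.executemany(
        "INSERT INTO target (id, value, updated_at) VALUES (?, ?, ?) "
        "ON CONFLICT(id) DO UPDATE SET value = excluded.value, "
        "updated_at = excluded.updated_at",
        rows,
    )
    conn.commit()
    return len(rows)
```

The same shape (watermark query, extract, merge/upsert) carries over to Snowflake or BigQuery, where `MERGE` plays the role of the `ON CONFLICT` clause.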
Required Skills:
- Strong SQL skills (CTEs, window functions, joins, aggregations, optimization)
- Strong Python skills for transformations and automation
- Hands-on experience with at least one cloud platform: AWS / Azure / GCP
- Experience with a modern data warehouse/lakehouse (Snowflake/BigQuery/Redshift/Synapse/Databricks)
- Solid understanding of ETL/ELT patterns (incremental loads, retries, idempotency, basic CDC)
- Comfort with data modeling for analytics and BI reporting
- Experience building stakeholder-friendly reporting in a BI tool (Power BI/Tableau/Looker/Sigma)
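As a taste of the CTE and window-function skills listed above, here is a common dedup pattern: keep the latest row per key with `ROW_NUMBER()`. Purely illustrative; the `orders` table and its columns are invented for the demo, run here against sqlite3 (which supports window functions) so the snippet is self-contained.

```python
import sqlite3

# CTE ranks each customer's orders newest-first; the outer query
# keeps only the latest one (rn = 1).
DEDUP_SQL = """
WITH ranked AS (
    SELECT
        customer_id,
        amount,
        ROW_NUMBER() OVER (
            PARTITION BY customer_id
            ORDER BY order_ts DESC
        ) AS rn
    FROM orders
)
SELECT customer_id, amount FROM ranked WHERE rn = 1
"""

def latest_order_per_customer(conn):
    """Return (customer_id, amount) for each customer's most recent order."""
    return sorted(conn.execute(DEDUP_SQL).fetchall())
```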
Nice to have:
- Orchestration tools: Airflow, dbt, Dagster, Prefect, ADF, Glue, etc.
- Streaming/event data: Kafka, Kinesis, Pub/Sub
- Monitoring/logging: CloudWatch, Azure Monitor, GCP Monitoring, Datadog
- CI/CD + Git-based workflows for data pipelines