05 Apr | Canopus Infosystems - A CMMI Level 3 | New Delhi
Apply on Kit Job: kitjob.in/job/46mgrl
Job Title: Data Engineer (Analyst)
Experience: 2.5 to 5 Years
Location: PAN India (Remote/On-site as applicable)
About the Role: We are looking for a Data Engineer (Analyst) to build and maintain reliable data pipelines and analytics-ready datasets that power BI reporting, product insights, and business decision-making. You'll work across multiple data sources, model clean reporting layers, and ensure data quality end-to-end.
Key Responsibilities:
- Build and maintain scalable ETL/ELT pipelines (batch and incremental) using SQL + Python
- Integrate data from databases, APIs, SaaS tools, event data, and flat files
- Design analytics-ready data models (star schema/marts) for self-serve reporting
- Create and optimize transformations in a cloud warehouse/lakehouse (e.g., Snowflake, BigQuery, Redshift, Synapse, Databricks)
- Partner with stakeholders to define KPIs, metric logic, and reporting requirements
- Maintain dashboards and reporting outputs in tools like Power BI, Tableau, Looker, or Sigma
- Implement data quality checks, monitoring, alerts, and documentation to keep datasets trusted
- Tune performance and cost (incremental loads, partitioning, query optimization, file formats)
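To illustrate the kind of incremental, idempotent loading the responsibilities above describe, here is a minimal sketch. The table names (`events` is implied, `target` is used here), columns, and watermark scheme are invented for the example, and sqlite3 stands in for a cloud warehouse; it is not a prescription for how the team actually implements pipelines.

```python
import sqlite3

def incremental_load(conn, rows):
    """Upsert only rows newer than the target's high-water mark.

    Re-running the same batch is a no-op (idempotent), so a failed
    load can simply be retried without creating duplicates.
    """
    # Watermark = latest timestamp already loaded (empty string if none).
    cur = conn.execute("SELECT COALESCE(MAX(updated_at), '') FROM target")
    watermark = cur.fetchone()[0]
    # Keep only rows strictly newer than the watermark.
    new_rows = [r for r in rows if r[2] > watermark]
    # Upsert so a retried batch overwrites rather than duplicates.
    conn.executemany(
        "INSERT INTO target (id, value, updated_at) VALUES (?, ?, ?) "
        "ON CONFLICT(id) DO UPDATE SET value = excluded.value, "
        "updated_at = excluded.updated_at",
        new_rows,
    )
    conn.commit()
    return len(new_rows)

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE target (id INTEGER PRIMARY KEY, value TEXT, updated_at TEXT)"
)
batch = [(1, "a", "2024-01-01"), (2, "b", "2024-01-02")]
incremental_load(conn, batch)  # loads 2 rows
incremental_load(conn, batch)  # retry: 0 rows, nothing duplicated
```

The same shape carries over to warehouse SQL (e.g., `MERGE` in Snowflake or BigQuery): filter by a watermark, then merge on the business key.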
Required Skills:
- Robust SQL skills (CTEs, window functions, joins, aggregations, optimization)
- Solid Python skills for transformations and automation
- Hands-on experience with at least one cloud platform: AWS / Azure / GCP
- Experience with an up-to-date data warehouse/lakehouse (Snowflake/BigQuery/Redshift/Synapse/Databricks)
- Solid understanding of ETL/ELT patterns (incremental loads, retries, idempotency, basic CDC)
- Comfort with data modeling for analytics and BI reporting
- Experience building stakeholder-friendly reporting in a BI tool (Power BI/Tableau/Looker/Sigma)
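As a sample of the SQL patterns listed above (a CTE plus a window function), the sketch below finds each customer's latest order. The `orders` table and its columns are invented for the example; sqlite3 is used only so the query is runnable end-to-end.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("acme", "2024-01-01", 100.0),
     ("acme", "2024-02-01", 150.0),
     ("zen",  "2024-01-15", 80.0)],
)

# CTE ranks each customer's orders newest-first; the outer query
# keeps only the top-ranked (latest) order per customer.
query = """
WITH ranked AS (
    SELECT customer, order_date, amount,
           ROW_NUMBER() OVER (
               PARTITION BY customer ORDER BY order_date DESC
           ) AS rn
    FROM orders
)
SELECT customer, order_date, amount FROM ranked
WHERE rn = 1
ORDER BY customer
"""
latest = conn.execute(query).fetchall()
# [('acme', '2024-02-01', 150.0), ('zen', '2024-01-15', 80.0)]
```

The identical query runs unchanged on Snowflake, BigQuery, or Redshift, since `ROW_NUMBER() OVER (PARTITION BY ...)` is standard SQL.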
Nice to have:
- Orchestration tools: Airflow, dbt, Dagster, Prefect, ADF, Glue, etc.
- Streaming/event data: Kafka, Kinesis, Pub/Sub
- Monitoring/logging: CloudWatch, Azure Monitor, GCP Monitoring, Datadog
- CI/CD + Git-based workflows for data pipelines