25 Mar
Canopus Infosystems - A CMMI Level 3
Nellore
Apply on Kit Job: kitjob.in/job/43jtwh
Job Title: Data Engineer (Analyst)
Experience: 2.5 to 5 Years
Location: PAN India (Remote/In office as applicable)

About the Role:

We are looking for a Data Engineer (Analyst) to build and maintain reliable data pipelines and analytics-ready datasets that power BI reporting, product insights, and business decision-making. You'll work across multiple data sources, model clean reporting layers, and ensure data quality end to end.

Key Responsibilities:

- Build and maintain scalable ETL/ELT pipelines (batch and incremental) using SQL + Python
- Integrate data from databases, APIs, SaaS tools, event data, and flat files
- Design analytics-ready data models (star schema/marts) for self-serve reporting
- Create and optimize transformations in a cloud warehouse/lakehouse (e.g., Snowflake, BigQuery, Redshift, Synapse, Databricks)
- Partner with stakeholders to define KPIs, metric logic, and reporting requirements
- Maintain dashboards and reporting outputs in tools like Power BI, Tableau, Looker, or Sigma
- Implement data quality checks, monitoring, alerts, and documentation to keep datasets trusted
- Tune performance and cost (incremental loads, partitioning, query optimization, file formats)

Required Skills:

- Strong SQL skills (CTEs, window functions, joins, aggregations, optimization)
- Strong Python skills for transformations and automation
- Hands-on experience with at least one cloud platform: AWS / Azure / GCP
- Experience with a modern data warehouse/lakehouse (Snowflake/BigQuery/Redshift/Synapse/Databricks)
- Solid understanding of ETL/ELT patterns (incremental loads, retries, idempotency, basic CDC)
- Comfort with data modeling for analytics and BI reporting
- Experience building stakeholder-friendly reporting in a BI tool (Power BI/Tableau/Looker/Sigma)

Nice to Have:

- Orchestration tools: Airflow, dbt, Dagster, Prefect, ADF, Glue, etc.
- Streaming/event data: Kafka, Kinesis, Pub/Sub
- Monitoring/logging: CloudWatch, Azure Monitor, GCP Monitoring, Datadog
- CI/CD + Git-based workflows for data pipelines
Apply on Kit Job: kitjob.in/job/43jtwh