IT firm, New Delhi (posted 17 Sep)
Responsibilities:
Design, build, and maintain data pipelines and ETL workflows on Google Cloud Platform.
Work with BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Storage to enable scalable data solutions.
Develop and optimize data models, transformations, and analytics layers.
Write efficient Python/SQL scripts for data processing and automation (see the sketch after this list).
Collaborate with cross-functional teams to understand data requirements and deliver solutions.
Implement best practices for performance, scalability, security, and cost optimization.
Support data migration and integration from traditional databases to BigQuery.
Work with CI/CD pipelines, Docker, and Kubernetes for deployment.
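As a point of reference for the Python and BigQuery work described above, here is a minimal, illustrative sketch of a Cloud Storage to BigQuery load, assuming the google-cloud-bigquery client library. The project, dataset, table, and bucket names are placeholders, not details from this posting.

from google.cloud import bigquery

def load_csv_to_bigquery(project_id: str, table_id: str, gcs_uri: str) -> None:
    # Load a CSV file from Cloud Storage into a BigQuery table.
    client = bigquery.Client(project=project_id)
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,                  # skip the header row
        autodetect=True,                      # infer the schema from the file
        write_disposition="WRITE_TRUNCATE",   # replace any existing rows
    )
    load_job = client.load_table_from_uri(gcs_uri, table_id, job_config=job_config)
    load_job.result()                         # block until the load job finishes
    print(f"Loaded {client.get_table(table_id).num_rows} rows into {table_id}")

if __name__ == "__main__":
    load_csv_to_bigquery(
        project_id="my-project",                    # placeholder
        table_id="my-project.analytics.events",     # placeholder
        gcs_uri="gs://my-bucket/raw/events.csv",    # placeholder
    )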
Requirements:
5+ years of hands-on experience in Data Engineering
Strong expertise in Google Cloud Platform (GCP) services such as BigQuery, Dataflow, Pub/Sub, and Dataproc
Proficiency in Python and SQL for large-scale data processing
Solid experience with ETL pipelines, data modeling, and performance tuning
Familiarity with Airflow/Cloud Composer for orchestration (see the DAG sketch after this list)
Experience with Docker, Kubernetes, Terraform, and CI/CD is a plus
Excellent problem-solving and communication skills
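For the Airflow/Cloud Composer orchestration mentioned above, below is a minimal, illustrative DAG sketch, assuming the apache-airflow-providers-google package. The bucket, dataset, and table names are placeholders, not details from this posting.

from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="gcs_to_bigquery_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # One task per run: append the day's CSV extract into BigQuery.
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="my-bucket",                                                 # placeholder
        source_objects=["raw/events/{{ ds }}.csv"],                         # one file per run date
        destination_project_dataset_table="my-project.analytics.events",   # placeholder
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_APPEND",
        autodetect=True,
    )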