26 Mar | Tredence | Nellore
Apply on Kit Job: kitjob.in/job/43xign
Role - GCP Data Engineer with PySpark & Python expertise

Experience: 5+ years preferred, with a data engineering background
Location - Bangalore

Responsibilities

Implement and architect solutions on Google Cloud Platform using GCP components.
Experience with Apache Beam/Google Dataflow/Apache Spark in creating end-to-end data pipelines.
Experience with some of the following: BigQuery, Bigtable, Cloud Storage, Datastore, Spanner, Cloud SQL, and machine learning services.
Programming experience with Hadoop, PySpark/Python, and SQL.
Expertise in at least two of these technologies: Relational Databases, Analytical Databases, NoSQL databases.
Certification as a Google Cloud Professional Data Engineer or Professional Cloud Architect is a major advantage.
Design, develop, and deploy scalable ETL/ELT pipelines using PySpark on GCP.
Utilize GCP services extensively, including BigQuery (data warehousing), Cloud Storage, Dataproc, and Dataflow.
Optimize PySpark jobs for performance and reliability, and fine-tune BigQuery queries.
Implement complex transformations and process large volumes of structured/unstructured data using Spark SQL and PySpark.
Build and manage automated workflows using Apache Airflow or Cloud Composer.
Strong proficiency in Python and SQL is essential.
Extensive hands-on experience with PySpark.
Proven experience with Google Cloud Platform (GCP) services.

Qualifications

Bachelor's or Master's degree in Computer Science, Engineering, or a related field.

Required Skills

GCP data engineering experience
BigQuery
PySpark
Python
📌 Senior Data Engineer Nellore