Interesting Opportunity: GCP Data Engineer - Spark/Hive
HuQuo
5 to 9 Yrs
- Regular
Job Details
- Job Description
Must-Have:
5+ years of experience in data engineering, building and maintaining large-scale data pipelines.
Experience designing and implementing a large-scale data lake on cloud infrastructure.
Strong technical expertise in Python and SQL
Extremely well-versed in Google Cloud Platform services, including BigQuery, Cloud Storage, Cloud Composer, Dataproc, Dataflow, and Pub/Sub.
Experience with Big Data tools such as Hadoop and Apache Spark (PySpark)
Experience developing DAGs in Apache Airflow 1.10.x or 2.x (a minimal DAG sketch follows this list)
Good Problem-Solving Skills
Detail Oriented
Strong analytical skills for working with a large number of databases and tables
Ability to work with geographically diverse teams.
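To illustrate the Airflow requirement above, here is a minimal sketch of an Airflow 2.x DAG. The DAG id, task id, and callable are hypothetical and not part of the posting; it only shows the structure a candidate would be expected to be comfortable with.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_to_gcs():
    # Placeholder: pull data from a source system and land it in Cloud Storage.
    pass


with DAG(
    dag_id="daily_ingest_example",       # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_to_gcs",
        python_callable=extract_to_gcs,
    )
```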
Good To Have:
Certification in GCP services.
Experience with Kubernetes.
Experience with Docker
Experience with CircleCI for Deployment
Experience with Great Expectations (a data-quality sketch follows this list).
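As a rough illustration of the Great Expectations item above, the sketch below assumes the classic Pandas-based API (ge.from_pandas), which may differ in newer releases. The DataFrame and column names are hypothetical.

```python
import pandas as pd
import great_expectations as ge

# Hypothetical sample data; in a real pipeline this would be a staged extract.
df = pd.DataFrame({"order_id": [1, 2, 3], "amount": [10.0, 25.5, None]})
gdf = ge.from_pandas(df)

# Declare and immediately evaluate two simple data-quality expectations;
# each call returns a result object whose "success" field reports pass/fail.
print(gdf.expect_column_values_to_not_be_null("order_id"))
print(gdf.expect_column_values_to_be_between("amount", min_value=0, max_value=10000))
```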
Responsibilities
Build data and ETL pipelines in GCP (an illustrative PySpark sketch follows this list).
Support migration of data to the cloud using Big Data technologies such as Spark, Hive, Talend, and Java.
Interact with customers on a daily basis to ensure smooth engagement.
Responsible for timely and quality deliveries.
Fulfill organizational responsibilities: share knowledge and experience with other groups in the organization and conduct technical training sessions.
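As an example of the ETL responsibility above, here is a minimal PySpark sketch of the kind of job that might run on Dataproc: read raw files from Cloud Storage, apply a simple transformation, and write to BigQuery via the Spark-BigQuery connector. The bucket, project, dataset, and table names are hypothetical, and the connector is assumed to be available on the cluster.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("gcs_to_bigquery_example").getOrCreate()

# Extract: read raw CSV files landed in a Cloud Storage bucket (hypothetical path).
raw = (spark.read
       .option("header", True)
       .option("inferSchema", True)
       .csv("gs://example-bucket/raw/orders/*.csv"))

# Transform: drop rows without an order id and keep only positive amounts.
orders = raw.dropna(subset=["order_id"]).filter(col("amount") > 0)

# Load: write to a BigQuery table through the Spark-BigQuery connector
# (hypothetical project/dataset/table; a temporary GCS bucket is needed for indirect writes).
(orders.write
 .format("bigquery")
 .option("table", "example-project.analytics.orders")
 .option("temporaryGcsBucket", "example-temp-bucket")
 .mode("overwrite")
 .save())

spark.stop()
```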
Location: Pune, Hyderabad, Remote
Education: Bachelor's or Master's degree (preferably BE/B.Tech) in Computer Science/IT.
(ref: hirist.tech)
Other Details
- Industry: IT Services & Consulting
- Recruiter Details: HuQuo
- Job Tags: spark, apache spark, hive, bigquery, pyspark
- Job Type: Full time