Basic Information

Ref Number

Req_00078029

Last day to apply

04-Aug-2022

Primary Location

NSEZ

Country

India

Job Type

Digital Solutions

Work Style

Combined

Description and Requirements

  • Must have 3-12 years of experience working as a GCP Data Engineer. 

  • Build a framework using Google cloud technologies to collect, transfer, and archive data files of varying formats into a data lake and content warehouse in a consistent way.

  • Strong knowledge of Python with GCP APIs such as BigQuery, GCS, and Dataflow.

  • Knowledge of RDBMS and strong SQL skills, required for source data analysis and extraction.

  • Take advantage of new cloud services as they become available to further streamline data transport.

  • Take full advantage of Google technologies to build a resilient, cost-effective data transport solution.

  • Provide operations and maintenance support for data feeds and developed solutions.

  • Automate deployment of solution and infrastructure.
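
The "collect, transfer, and archive data files of varying formats in a consistent way" responsibility above usually hinges on a predictable naming convention in the data lake. As a hypothetical sketch (the bucket name, path layout, and helper function are illustrative assumptions, not part of this posting), a small Python helper might derive the archive location like this:

```python
from datetime import date
from pathlib import PurePosixPath

def lake_path(bucket: str, source_file: str, ingest_date: date) -> str:
    """Build a consistent GCS archive path for an incoming file.

    Files of any format land under raw/<extension>/<yyyy>/<mm>/<dd>/
    so downstream jobs (Dataflow pipelines, BigQuery loads) can find
    them predictably. Bucket and layout are illustrative assumptions.
    """
    ext = PurePosixPath(source_file).suffix.lstrip(".").lower() or "unknown"
    return (
        f"gs://{bucket}/raw/{ext}/"
        f"{ingest_date:%Y/%m/%d}/{PurePosixPath(source_file).name}"
    )

print(lake_path("example-lake", "orders_2022.csv", date(2022, 8, 4)))
# gs://example-lake/raw/csv/2022/08/04/orders_2022.csv
```

Keeping the convention in one function means the transfer, archive, and load steps all agree on where a file lives without duplicating path logic.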

Additional Job Description

  • Minimum 4+ years of experience, of which 2+ years must be in GCP.

  • Build a framework using Google cloud technologies to collect, transfer, and archive data files of varying formats into a data lake and content warehouse in a consistent way.

  • Strong knowledge of Python with GCP APIs like BigQuery, GCS, and Dataflow, plus general bash scripting.

  • Good knowledge of NoSQL databases like MongoDB/Cassandra/HBase.

  • Good knowledge of Google BigQuery, Cloud Dataflow, and Cloud Composer.

  • Knowledge of Git as a version repository tool.

  • Share ideas and suggestions, and collaborate with the Datalake team.

  • Strong verbal and written communication skills are a must.

  • Knowledge of RDBMS and strong SQL skills, required for source data analysis and extraction.

  • Take advantage of new cloud services as they become available to further streamline data transport.

  • Take full advantage of Google technologies to build a resilient, cost-effective data transport solution.

  • Provide operations and maintenance support for data feeds and developed solutions.

  • Automate deployment of solution and infrastructure.

  • Candidate should have hands-on experience with the following services:
      ◦ Google BigQuery
      ◦ Google Storage
      ◦ Google Dataflow
      ◦ DataProc
      ◦ Python
      ◦ PubSub
      ◦ Airflow/Composer
      ◦ CloudSQL
      ◦ Cloud Build
      ◦ Terraform

Roles:

  • Work with client teams to design and implement modern, scalable data solutions using a range of new and emerging technologies from the Google Cloud Platform.

  • Work with Agile and DevOps techniques and implementation approaches in the delivery.

  • Showcase your GCP data engineering experience when communicating with clients on their requirements, turning these into technical data solutions.

  • Build and deliver data solutions using GCP products and offerings.
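
Ingesting "data files of varying formats in a consistent way", as required above, typically means normalizing each source format to one record shape before loading it into BigQuery. A minimal, hypothetical sketch using only the Python standard library (the supported formats and field names are illustrative assumptions):

```python
import csv
import io
import json

def normalize(payload: str, fmt: str) -> list:
    """Convert a raw CSV or JSON-lines payload into a uniform list of
    dicts, ready for e.g. a newline-delimited-JSON BigQuery load job.
    The two formats handled here are illustrative assumptions."""
    if fmt == "csv":
        # DictReader uses the header row as keys; all values are strings.
        return list(csv.DictReader(io.StringIO(payload)))
    if fmt == "jsonl":
        return [json.loads(line) for line in payload.splitlines() if line.strip()]
    raise ValueError(f"unsupported format: {fmt}")

csv_rows = normalize("id,country\n1,India\n", "csv")
jsonl_rows = normalize('{"id": "1", "country": "India"}\n', "jsonl")
print(csv_rows == jsonl_rows)  # True: both formats yield the same records
```

Funneling every format through one normalizer keeps the downstream load and transformation steps format-agnostic, which is what makes the lake "consistent".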