Description and Requirements
Excellent skills in writing complex SQL queries.
Very good understanding of Data Migration & ETL processes.
Hands-on experience with the GCP cloud data implementation suite, including BigQuery, Pub/Sub, Dataflow/Apache Beam (Java/Python), Airflow/Composer, Cloud Storage, etc.
Must have Google Cloud BigQuery experience, including datasets, objects, and IAM.
Strong experience with relational and non-relational databases in the cloud with billions of records (structured and unstructured data).
Ability to design and develop data flow pipelines from scratch.
Excellent problem-solving and debugging skills.
Regulatory and compliance work in data management.
Showcase your GCP data engineering experience when communicating with clients about their requirements, turning these into technical data solutions.
Identify internal/external data sources to design and implement table structures, data products, ETL strategy, automation frameworks, and scalable data pipelines.
Working knowledge of cloud architecture (Google Cloud Platform).
Knowledge of relational databases (Microsoft SQL Server, MySQL, Oracle).
Working knowledge of RDBMS, GCP data pipelines, GCP storage, Python, SQL, DMS, GCP ETL, Sqoop, and similar ETL tools.
Must have worked on at least one ETL project.
Additional Job Description
Must have 4+ years of experience.
Working experience in developing data pipelines.
Hands-on, in-depth data engineering experience working with Google data products.
Work with Agile and DevOps techniques and implementation approaches during delivery.
Build and deliver data solutions using GCP products and offerings.
Strong communication skills.
Willing to work until 2:30 PM EST.