Description: |
You are expected to bring a DevOps mindset to enable big data and batch/real-time analytical solutions that leverage emerging technologies. You will develop prototypes and proofs of concept for selected solutions and implement complex big data projects. You will apply a creative mindset with a focus on collecting, parsing, managing, and automating data feedback loops in support of business innovation.
Must have:
Hands-on experience with Google Cloud Platform (GCP) data services
3+ years of experience with implementations, migrations, and upgrades in a GCP environment
Able to work with technical and operational subject matter experts to build data pipelines and to understand reports, controls, data movement, and business terminology
Programming skills (PySpark/Python)
Working knowledge of Hive, Jenkins, SQL, Hadoop, and Spark
Experience with Google Cloud data products, e.g. BigQuery (GCP's data warehouse), Dataflow, and Dataproc (a minimal pipeline sketch follows this list)
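
As a concrete illustration of the must-have stack above, here is a minimal sketch of a batch pipeline on Dataproc using PySpark and the spark-bigquery connector. The project, dataset, table, and bucket names are hypothetical placeholders, and the connector jar is assumed to be available on the cluster (recent Dataproc images bundle it, or it can be supplied via --jars):

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (
        SparkSession.builder
        .appName("daily-orders-aggregation")
        .getOrCreate()
    )

    # Read a source table from BigQuery (hypothetical table name).
    orders = (
        spark.read.format("bigquery")
        .option("table", "my-project.sales.orders")
        .load()
    )

    # Aggregate order amounts per calendar day.
    daily_totals = (
        orders
        .groupBy(F.to_date("order_ts").alias("order_date"))
        .agg(F.sum("amount").alias("total_amount"))
    )

    # Write the result back to BigQuery; the connector stages data
    # through a GCS bucket (hypothetical bucket name).
    (
        daily_totals.write.format("bigquery")
        .option("table", "my-project.sales.daily_totals")
        .option("temporaryGcsBucket", "my-staging-bucket")
        .mode("overwrite")
        .save()
    )
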
Good to have skills:
Real-time streaming experience with Kafka, RabbitMQ, Azure Event Hubs, Spark Streaming, etc. (see the streaming sketch after this list)
Experience with data processing tools (Apache Flink, Apache Beam, Apache NiFi)
Experience developing architecture and solution designs (including API design)
Experience with Kubernetes and Docker
Experience with microservices design and implementation
Experience with CI/CD
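
For the real-time items above, here is a minimal sketch of consuming a Kafka topic with Spark Structured Streaming. The broker address and topic name are hypothetical placeholders, and the spark-sql-kafka package is assumed to be available (e.g. via --packages):

    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("kafka-events-stream")
        .getOrCreate()
    )

    # Subscribe to a Kafka topic (hypothetical broker and topic).
    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker-1:9092")
        .option("subscribe", "events")
        .load()
    )

    # Kafka delivers the payload as bytes; cast it to a string.
    parsed = events.selectExpr("CAST(value AS STRING) AS payload")

    # Print micro-batches to the console every 30 seconds; in a real
    # pipeline this sink would be BigQuery, GCS, or another store.
    query = (
        parsed.writeStream
        .format("console")
        .outputMode("append")
        .trigger(processingTime="30 seconds")
        .start()
    )
    query.awaitTermination()
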
Preferred Technical and Professional Expertise:
Holding a GCP certification (e.g. Professional Data Engineer, Associate Cloud Engineer)
Education:
Bachelor's degree in Computer Science or a related field
|